| sha (stringlengths 40-40) | text (stringlengths 1-13.4M) | id (stringlengths 2-117) | tags (listlengths 1-7.91k) | created_at (stringlengths 25-25) | metadata (stringlengths 2-875k) | last_modified (stringlengths 25-25) | arxiv (listlengths 0-25) | languages (listlengths 0-7.91k) | tags_str (stringlengths 17-159k) | text_str (stringlengths 1-447k) | text_lists (listlengths 0-352) | processed_texts (listlengths 1-353) | tokens_length (listlengths 1-353) | input_texts (listlengths 1-40) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fc61580fef836a90f3ea225ed5487a3dbf6e7d9f
|
# Dataset Card for "HQMR"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
danjacobellis/HQMR
|
[
"region:us"
] |
2023-08-17T20:05:01+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 65460837045.38, "num_examples": 177180}], "download_size": 66435478074, "dataset_size": 65460837045.38}}
|
2023-08-18T09:34:14+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "HQMR"
More Information needed
|
[
"# Dataset Card for \"HQMR\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"HQMR\"\n\nMore Information needed"
] |
[
6,
12
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"HQMR\"\n\nMore Information needed"
] |
8a1c8750bcaa8ccbba5d880a74e7b18642249968
|
# Dataset of inaba_tewi/因幡てゐ/이나바 테위 (Touhou)
This is the dataset of inaba_tewi/因幡てゐ/이나바 테위 (Touhou), containing 500 images and their tags.
The core tags of this character are `animal_ears, rabbit_ears, short_hair, red_eyes, black_hair, brown_hair, rabbit_girl, tail, rabbit_tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 492.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/inaba_tewi_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 313.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/inaba_tewi_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1137 | 637.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/inaba_tewi_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 451.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/inaba_tewi_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1137 | 844.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/inaba_tewi_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/inaba_tewi_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
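For the pre-packaged IMG+TXT bundles in the table above (e.g. `dataset-800.zip`), a similar sketch works without waifuc. This is a minimal example, assuming the archive contains images with same-named `.txt` tag files next to them (the directory name `dataset_800` is arbitrary):

```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT bundle listed in the package table
zip_file = hf_hub_download(
    repo_id='CyberHarem/inaba_tewi_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract it to a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# list a few extracted files; each image is assumed to have a matching .txt tag file
for name in sorted(os.listdir(dataset_dir))[:6]:
    print(name)
```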
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, carrot_necklace, pendant, pink_dress, solo, open_mouth, short_sleeves, looking_at_viewer, simple_background, smile, puffy_sleeves, white_background |
| 1 | 25 |  |  |  |  |  | 1girl, carrot_necklace, puffy_short_sleeves, solo, pink_dress, smile, floppy_ears, looking_at_viewer, simple_background, bangs, blush, white_background, closed_mouth, hair_between_eyes, ribbon-trimmed_dress, frills |
| 2 | 13 |  |  |  |  |  | blush, 1girl, bangs, floppy_ears, loli, looking_at_viewer, nipples, completely_nude, flat_chest, solo, pussy, simple_background, full_body, hair_between_eyes, navel, open_mouth, smile, barefoot, white_background, spread_legs, :3, censored, small_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | carrot_necklace | pendant | pink_dress | solo | open_mouth | short_sleeves | looking_at_viewer | simple_background | smile | puffy_sleeves | white_background | puffy_short_sleeves | floppy_ears | bangs | blush | closed_mouth | hair_between_eyes | ribbon-trimmed_dress | frills | loli | nipples | completely_nude | flat_chest | pussy | full_body | navel | barefoot | spread_legs | :3 | censored | small_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------------|:----------|:-------------|:-------|:-------------|:----------------|:--------------------|:--------------------|:--------|:----------------|:-------------------|:----------------------|:--------------|:--------|:--------|:---------------|:--------------------|:-----------------------|:---------|:-------|:----------|:------------------|:-------------|:--------|:------------|:--------|:-----------|:--------------|:-----|:-----------|:----------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 25 |  |  |  |  |  | X | X | | X | X | | | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 2 | 13 |  |  |  |  |  | X | | | | X | X | | X | X | X | | X | | X | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/inaba_tewi_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-17T20:11:38+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T13:22:28+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of inaba\_tewi/因幡てゐ/이나바 테위 (Touhou)
==========================================
This is the dataset of inaba\_tewi/因幡てゐ/이나바 테위 (Touhou), containing 500 images and their tags.
The core tags of this character are 'animal\_ears, rabbit\_ears, short\_hair, red\_eyes, black\_hair, brown\_hair, rabbit\_girl, tail, rabbit\_tail', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
54575a492a2b84f00f512181669835e49c152173
|
# Dataset Card for "942ab115"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/942ab115
|
[
"region:us"
] |
2023-08-17T20:19:14+00:00
|
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 178, "num_examples": 10}], "download_size": 1314, "dataset_size": 178}}
|
2023-08-17T20:19:15+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "942ab115"
More Information needed
|
[
"# Dataset Card for \"942ab115\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"942ab115\"\n\nMore Information needed"
] |
[
6,
14
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"942ab115\"\n\nMore Information needed"
] |
c49b97c9154291c9f80bb52183b025af84c56d86
|
# Dataset Card for "azaria-mitchell"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
notrichardren/azaria-mitchell
|
[
"region:us"
] |
2023-08-17T20:22:46+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "combined", "path": "data/combined-*"}, {"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "claim", "dtype": "string"}, {"name": "label", "dtype": "int64"}, {"name": "dataset", "dtype": "string"}, {"name": "qa_type", "dtype": "int64"}, {"name": "ind", "dtype": "int64"}], "splits": [{"name": "combined", "num_bytes": 1553103, "num_examples": 17092}, {"name": "train", "num_bytes": 1244045, "num_examples": 13673}, {"name": "test", "num_bytes": 309058, "num_examples": 3419}], "download_size": 1228770, "dataset_size": 3106206}}
|
2023-08-17T20:22:50+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "azaria-mitchell"
More Information needed
|
[
"# Dataset Card for \"azaria-mitchell\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"azaria-mitchell\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"azaria-mitchell\"\n\nMore Information needed"
] |
5e44eeb655f265cf140a5011aa606548c578ecf8
|
# Dataset Card for "directv-zocalos-5fps"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Seenka/directv-zocalos-5fps
|
[
"region:us"
] |
2023-08-17T20:26:09+00:00
|
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "frame_time", "dtype": "time64[us]"}, {"name": "video_storage_path", "dtype": "string"}, {"name": "zocalo_id", "dtype": "string"}, {"name": "frame_number", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 3651052944.125, "num_examples": 12965}], "download_size": 3500936724, "dataset_size": 3651052944.125}}
|
2023-08-17T21:14:26+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "directv-zocalos-5fps"
More Information needed
|
[
"# Dataset Card for \"directv-zocalos-5fps\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"directv-zocalos-5fps\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"directv-zocalos-5fps\"\n\nMore Information needed"
] |
b131dc3fb405fc904882ba04bdcb9b7bceb93ae7
|
# Dataset Card for "AA_GPTNEO_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
EgilKarlsen/AA_GPTNEO_Baseline
|
[
"region:us"
] |
2023-08-17T20:34:55+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "768", "dtype": "float32"}, {"name": "769", "dtype": "float32"}, {"name": "770", "dtype": "float32"}, {"name": "771", "dtype": "float32"}, {"name": "772", "dtype": "float32"}, {"name": "773", "dtype": "float32"}, {"name": "774", "dtype": "float32"}, {"name": "775", "dtype": "float32"}, {"name": "776", "dtype": "float32"}, {"name": "777", "dtype": "float32"}, {"name": "778", "dtype": "float32"}, {"name": "779", "dtype": "float32"}, {"name": "780", "dtype": "float32"}, {"name": "781", "dtype": "float32"}, {"name": "782", "dtype": "float32"}, {"name": "783", "dtype": "float32"}, {"name": "784", "dtype": "float32"}, {"name": "785", "dtype": "float32"}, {"name": "786", "dtype": "float32"}, {"name": "787", "dtype": "float32"}, {"name": "788", "dtype": "float32"}, {"name": "789", "dtype": "float32"}, {"name": "790", "dtype": "float32"}, {"name": "791", "dtype": "float32"}, {"name": "792", "dtype": "float32"}, {"name": "793", "dtype": "float32"}, {"name": "794", "dtype": "float32"}, {"name": "795", "dtype": "float32"}, {"name": "796", "dtype": "float32"}, {"name": "797", "dtype": "float32"}, {"name": "798", "dtype": "float32"}, {"name": "799", "dtype": "float32"}, {"name": "800", "dtype": "float32"}, {"name": "801", "dtype": "float32"}, {"name": "802", "dtype": "float32"}, {"name": "803", "dtype": "float32"}, {"name": "804", "dtype": "float32"}, {"name": "805", "dtype": "float32"}, {"name": "806", "dtype": "float32"}, {"name": "807", "dtype": "float32"}, {"name": "808", "dtype": "float32"}, {"name": "809", "dtype": "float32"}, {"name": "810", "dtype": "float32"}, {"name": "811", "dtype": "float32"}, {"name": "812", "dtype": "float32"}, {"name": "813", "dtype": "float32"}, {"name": "814", "dtype": "float32"}, {"name": "815", "dtype": "float32"}, {"name": "816", "dtype": "float32"}, {"name": "817", "dtype": "float32"}, {"name": "818", "dtype": "float32"}, {"name": "819", "dtype": "float32"}, {"name": "820", "dtype": "float32"}, {"name": "821", "dtype": "float32"}, {"name": "822", "dtype": "float32"}, {"name": "823", "dtype": "float32"}, {"name": "824", "dtype": "float32"}, {"name": "825", "dtype": "float32"}, {"name": "826", "dtype": "float32"}, {"name": "827", "dtype": "float32"}, {"name": "828", "dtype": "float32"}, {"name": "829", "dtype": "float32"}, {"name": "830", "dtype": "float32"}, {"name": "831", "dtype": "float32"}, {"name": "832", "dtype": "float32"}, {"name": "833", "dtype": "float32"}, {"name": "834", "dtype": "float32"}, {"name": "835", "dtype": "float32"}, {"name": "836", "dtype": "float32"}, {"name": "837", "dtype": "float32"}, {"name": "838", "dtype": "float32"}, {"name": "839", "dtype": "float32"}, {"name": "840", "dtype": "float32"}, {"name": "841", "dtype": "float32"}, {"name": "842", "dtype": "float32"}, {"name": "843", "dtype": "float32"}, {"name": "844", "dtype": "float32"}, {"name": "845", "dtype": "float32"}, {"name": "846", "dtype": "float32"}, {"name": "847", "dtype": "float32"}, {"name": "848", "dtype": "float32"}, {"name": "849", "dtype": "float32"}, {"name": "850", "dtype": "float32"}, {"name": "851", "dtype": "float32"}, {"name": "852", "dtype": "float32"}, {"name": "853", "dtype": "float32"}, {"name": "854", "dtype": "float32"}, {"name": "855", "dtype": "float32"}, {"name": "856", "dtype": "float32"}, {"name": "857", "dtype": "float32"}, {"name": "858", "dtype": "float32"}, {"name": "859", "dtype": "float32"}, {"name": "860", "dtype": "float32"}, {"name": "861", "dtype": "float32"}, {"name": 
"862", "dtype": "float32"}, {"name": "863", "dtype": "float32"}, {"name": "864", "dtype": "float32"}, {"name": "865", "dtype": "float32"}, {"name": "866", "dtype": "float32"}, {"name": "867", "dtype": "float32"}, {"name": "868", "dtype": "float32"}, {"name": "869", "dtype": "float32"}, {"name": "870", "dtype": "float32"}, {"name": "871", "dtype": "float32"}, {"name": "872", "dtype": "float32"}, {"name": "873", "dtype": "float32"}, {"name": "874", "dtype": "float32"}, {"name": "875", "dtype": "float32"}, {"name": "876", "dtype": "float32"}, {"name": "877", "dtype": "float32"}, {"name": "878", "dtype": "float32"}, {"name": "879", "dtype": "float32"}, {"name": "880", "dtype": "float32"}, {"name": "881", "dtype": "float32"}, {"name": "882", "dtype": "float32"}, {"name": "883", "dtype": "float32"}, {"name": "884", "dtype": "float32"}, {"name": "885", "dtype": "float32"}, {"name": "886", "dtype": "float32"}, {"name": "887", "dtype": "float32"}, {"name": "888", "dtype": "float32"}, {"name": "889", "dtype": "float32"}, {"name": "890", "dtype": "float32"}, {"name": "891", "dtype": "float32"}, {"name": "892", "dtype": "float32"}, {"name": "893", "dtype": "float32"}, {"name": "894", "dtype": "float32"}, {"name": "895", "dtype": "float32"}, {"name": "896", "dtype": "float32"}, {"name": "897", "dtype": "float32"}, {"name": "898", "dtype": "float32"}, {"name": "899", "dtype": "float32"}, {"name": "900", "dtype": "float32"}, {"name": "901", "dtype": "float32"}, {"name": "902", "dtype": "float32"}, {"name": "903", "dtype": "float32"}, {"name": "904", "dtype": "float32"}, {"name": "905", "dtype": "float32"}, {"name": "906", "dtype": "float32"}, {"name": "907", "dtype": "float32"}, {"name": "908", "dtype": "float32"}, {"name": "909", "dtype": "float32"}, {"name": "910", "dtype": "float32"}, {"name": "911", "dtype": "float32"}, {"name": "912", "dtype": "float32"}, {"name": "913", "dtype": "float32"}, {"name": "914", "dtype": "float32"}, {"name": "915", "dtype": "float32"}, {"name": "916", "dtype": "float32"}, {"name": "917", "dtype": "float32"}, {"name": "918", "dtype": "float32"}, {"name": "919", "dtype": "float32"}, {"name": "920", "dtype": "float32"}, {"name": "921", "dtype": "float32"}, {"name": "922", "dtype": "float32"}, {"name": "923", "dtype": "float32"}, {"name": "924", "dtype": "float32"}, {"name": "925", "dtype": "float32"}, {"name": "926", "dtype": "float32"}, {"name": "927", "dtype": "float32"}, {"name": "928", "dtype": "float32"}, {"name": "929", "dtype": "float32"}, {"name": "930", "dtype": "float32"}, {"name": "931", "dtype": "float32"}, {"name": "932", "dtype": "float32"}, {"name": "933", "dtype": "float32"}, {"name": "934", "dtype": "float32"}, {"name": "935", "dtype": "float32"}, {"name": "936", "dtype": "float32"}, {"name": "937", "dtype": "float32"}, {"name": "938", "dtype": "float32"}, {"name": "939", "dtype": "float32"}, {"name": "940", "dtype": "float32"}, {"name": "941", "dtype": "float32"}, {"name": "942", "dtype": "float32"}, {"name": "943", "dtype": "float32"}, {"name": "944", "dtype": "float32"}, {"name": "945", "dtype": "float32"}, {"name": "946", "dtype": "float32"}, {"name": "947", "dtype": "float32"}, {"name": "948", "dtype": "float32"}, {"name": "949", "dtype": "float32"}, {"name": "950", "dtype": "float32"}, {"name": "951", "dtype": "float32"}, {"name": "952", "dtype": "float32"}, {"name": "953", "dtype": "float32"}, {"name": "954", "dtype": "float32"}, {"name": "955", "dtype": "float32"}, {"name": "956", "dtype": "float32"}, {"name": "957", "dtype": "float32"}, {"name": 
"958", "dtype": "float32"}, {"name": "959", "dtype": "float32"}, {"name": "960", "dtype": "float32"}, {"name": "961", "dtype": "float32"}, {"name": "962", "dtype": "float32"}, {"name": "963", "dtype": "float32"}, {"name": "964", "dtype": "float32"}, {"name": "965", "dtype": "float32"}, {"name": "966", "dtype": "float32"}, {"name": "967", "dtype": "float32"}, {"name": "968", "dtype": "float32"}, {"name": "969", "dtype": "float32"}, {"name": "970", "dtype": "float32"}, {"name": "971", "dtype": "float32"}, {"name": "972", "dtype": "float32"}, {"name": "973", "dtype": "float32"}, {"name": "974", "dtype": "float32"}, {"name": "975", "dtype": "float32"}, {"name": "976", "dtype": "float32"}, {"name": "977", "dtype": "float32"}, {"name": "978", "dtype": "float32"}, {"name": "979", "dtype": "float32"}, {"name": "980", "dtype": "float32"}, {"name": "981", "dtype": "float32"}, {"name": "982", "dtype": "float32"}, {"name": "983", "dtype": "float32"}, {"name": "984", "dtype": "float32"}, {"name": "985", "dtype": "float32"}, {"name": "986", "dtype": "float32"}, {"name": "987", "dtype": "float32"}, {"name": "988", "dtype": "float32"}, {"name": "989", "dtype": "float32"}, {"name": "990", "dtype": "float32"}, {"name": "991", "dtype": "float32"}, {"name": "992", "dtype": "float32"}, {"name": "993", "dtype": "float32"}, {"name": "994", "dtype": "float32"}, {"name": "995", "dtype": "float32"}, {"name": "996", "dtype": "float32"}, {"name": "997", "dtype": "float32"}, {"name": "998", "dtype": "float32"}, {"name": "999", "dtype": "float32"}, {"name": "1000", "dtype": "float32"}, {"name": "1001", "dtype": "float32"}, {"name": "1002", "dtype": "float32"}, {"name": "1003", "dtype": "float32"}, {"name": "1004", "dtype": "float32"}, {"name": "1005", "dtype": "float32"}, {"name": "1006", "dtype": "float32"}, {"name": "1007", "dtype": "float32"}, {"name": "1008", "dtype": "float32"}, {"name": "1009", "dtype": "float32"}, {"name": "1010", "dtype": "float32"}, {"name": "1011", "dtype": "float32"}, {"name": "1012", "dtype": "float32"}, {"name": "1013", "dtype": "float32"}, {"name": "1014", "dtype": "float32"}, {"name": "1015", "dtype": "float32"}, {"name": "1016", "dtype": "float32"}, {"name": "1017", "dtype": "float32"}, {"name": "1018", "dtype": "float32"}, {"name": "1019", "dtype": "float32"}, {"name": "1020", "dtype": "float32"}, {"name": "1021", "dtype": "float32"}, {"name": "1022", "dtype": "float32"}, {"name": "1023", "dtype": "float32"}, {"name": "1024", "dtype": "float32"}, {"name": "1025", "dtype": "float32"}, {"name": "1026", "dtype": "float32"}, {"name": "1027", "dtype": "float32"}, {"name": "1028", "dtype": "float32"}, {"name": "1029", "dtype": "float32"}, {"name": "1030", "dtype": "float32"}, {"name": "1031", "dtype": "float32"}, {"name": "1032", "dtype": "float32"}, {"name": "1033", "dtype": "float32"}, {"name": "1034", "dtype": "float32"}, {"name": "1035", "dtype": "float32"}, {"name": "1036", "dtype": "float32"}, {"name": "1037", "dtype": "float32"}, {"name": "1038", "dtype": "float32"}, {"name": "1039", "dtype": "float32"}, {"name": "1040", "dtype": "float32"}, {"name": "1041", "dtype": "float32"}, {"name": "1042", "dtype": "float32"}, {"name": "1043", "dtype": "float32"}, {"name": "1044", "dtype": "float32"}, {"name": "1045", "dtype": "float32"}, {"name": "1046", "dtype": "float32"}, {"name": "1047", "dtype": "float32"}, {"name": "1048", "dtype": "float32"}, {"name": "1049", "dtype": "float32"}, {"name": "1050", "dtype": "float32"}, {"name": "1051", "dtype": "float32"}, {"name": "1052", "dtype": 
"float32"}, {"name": "1053", "dtype": "float32"}, {"name": "1054", "dtype": "float32"}, {"name": "1055", "dtype": "float32"}, {"name": "1056", "dtype": "float32"}, {"name": "1057", "dtype": "float32"}, {"name": "1058", "dtype": "float32"}, {"name": "1059", "dtype": "float32"}, {"name": "1060", "dtype": "float32"}, {"name": "1061", "dtype": "float32"}, {"name": "1062", "dtype": "float32"}, {"name": "1063", "dtype": "float32"}, {"name": "1064", "dtype": "float32"}, {"name": "1065", "dtype": "float32"}, {"name": "1066", "dtype": "float32"}, {"name": "1067", "dtype": "float32"}, {"name": "1068", "dtype": "float32"}, {"name": "1069", "dtype": "float32"}, {"name": "1070", "dtype": "float32"}, {"name": "1071", "dtype": "float32"}, {"name": "1072", "dtype": "float32"}, {"name": "1073", "dtype": "float32"}, {"name": "1074", "dtype": "float32"}, {"name": "1075", "dtype": "float32"}, {"name": "1076", "dtype": "float32"}, {"name": "1077", "dtype": "float32"}, {"name": "1078", "dtype": "float32"}, {"name": "1079", "dtype": "float32"}, {"name": "1080", "dtype": "float32"}, {"name": "1081", "dtype": "float32"}, {"name": "1082", "dtype": "float32"}, {"name": "1083", "dtype": "float32"}, {"name": "1084", "dtype": "float32"}, {"name": "1085", "dtype": "float32"}, {"name": "1086", "dtype": "float32"}, {"name": "1087", "dtype": "float32"}, {"name": "1088", "dtype": "float32"}, {"name": "1089", "dtype": "float32"}, {"name": "1090", "dtype": "float32"}, {"name": "1091", "dtype": "float32"}, {"name": "1092", "dtype": "float32"}, {"name": "1093", "dtype": "float32"}, {"name": "1094", "dtype": "float32"}, {"name": "1095", "dtype": "float32"}, {"name": "1096", "dtype": "float32"}, {"name": "1097", "dtype": "float32"}, {"name": "1098", "dtype": "float32"}, {"name": "1099", "dtype": "float32"}, {"name": "1100", "dtype": "float32"}, {"name": "1101", "dtype": "float32"}, {"name": "1102", "dtype": "float32"}, {"name": "1103", "dtype": "float32"}, {"name": "1104", "dtype": "float32"}, {"name": "1105", "dtype": "float32"}, {"name": "1106", "dtype": "float32"}, {"name": "1107", "dtype": "float32"}, {"name": "1108", "dtype": "float32"}, {"name": "1109", "dtype": "float32"}, {"name": "1110", "dtype": "float32"}, {"name": "1111", "dtype": "float32"}, {"name": "1112", "dtype": "float32"}, {"name": "1113", "dtype": "float32"}, {"name": "1114", "dtype": "float32"}, {"name": "1115", "dtype": "float32"}, {"name": "1116", "dtype": "float32"}, {"name": "1117", "dtype": "float32"}, {"name": "1118", "dtype": "float32"}, {"name": "1119", "dtype": "float32"}, {"name": "1120", "dtype": "float32"}, {"name": "1121", "dtype": "float32"}, {"name": "1122", "dtype": "float32"}, {"name": "1123", "dtype": "float32"}, {"name": "1124", "dtype": "float32"}, {"name": "1125", "dtype": "float32"}, {"name": "1126", "dtype": "float32"}, {"name": "1127", "dtype": "float32"}, {"name": "1128", "dtype": "float32"}, {"name": "1129", "dtype": "float32"}, {"name": "1130", "dtype": "float32"}, {"name": "1131", "dtype": "float32"}, {"name": "1132", "dtype": "float32"}, {"name": "1133", "dtype": "float32"}, {"name": "1134", "dtype": "float32"}, {"name": "1135", "dtype": "float32"}, {"name": "1136", "dtype": "float32"}, {"name": "1137", "dtype": "float32"}, {"name": "1138", "dtype": "float32"}, {"name": "1139", "dtype": "float32"}, {"name": "1140", "dtype": "float32"}, {"name": "1141", "dtype": "float32"}, {"name": "1142", "dtype": "float32"}, {"name": "1143", "dtype": "float32"}, {"name": "1144", "dtype": "float32"}, {"name": "1145", "dtype": "float32"}, {"name": 
"1146", "dtype": "float32"}, {"name": "1147", "dtype": "float32"}, {"name": "1148", "dtype": "float32"}, {"name": "1149", "dtype": "float32"}, {"name": "1150", "dtype": "float32"}, {"name": "1151", "dtype": "float32"}, {"name": "1152", "dtype": "float32"}, {"name": "1153", "dtype": "float32"}, {"name": "1154", "dtype": "float32"}, {"name": "1155", "dtype": "float32"}, {"name": "1156", "dtype": "float32"}, {"name": "1157", "dtype": "float32"}, {"name": "1158", "dtype": "float32"}, {"name": "1159", "dtype": "float32"}, {"name": "1160", "dtype": "float32"}, {"name": "1161", "dtype": "float32"}, {"name": "1162", "dtype": "float32"}, {"name": "1163", "dtype": "float32"}, {"name": "1164", "dtype": "float32"}, {"name": "1165", "dtype": "float32"}, {"name": "1166", "dtype": "float32"}, {"name": "1167", "dtype": "float32"}, {"name": "1168", "dtype": "float32"}, {"name": "1169", "dtype": "float32"}, {"name": "1170", "dtype": "float32"}, {"name": "1171", "dtype": "float32"}, {"name": "1172", "dtype": "float32"}, {"name": "1173", "dtype": "float32"}, {"name": "1174", "dtype": "float32"}, {"name": "1175", "dtype": "float32"}, {"name": "1176", "dtype": "float32"}, {"name": "1177", "dtype": "float32"}, {"name": "1178", "dtype": "float32"}, {"name": "1179", "dtype": "float32"}, {"name": "1180", "dtype": "float32"}, {"name": "1181", "dtype": "float32"}, {"name": "1182", "dtype": "float32"}, {"name": "1183", "dtype": "float32"}, {"name": "1184", "dtype": "float32"}, {"name": "1185", "dtype": "float32"}, {"name": "1186", "dtype": "float32"}, {"name": "1187", "dtype": "float32"}, {"name": "1188", "dtype": "float32"}, {"name": "1189", "dtype": "float32"}, {"name": "1190", "dtype": "float32"}, {"name": "1191", "dtype": "float32"}, {"name": "1192", "dtype": "float32"}, {"name": "1193", "dtype": "float32"}, {"name": "1194", "dtype": "float32"}, {"name": "1195", "dtype": "float32"}, {"name": "1196", "dtype": "float32"}, {"name": "1197", "dtype": "float32"}, {"name": "1198", "dtype": "float32"}, {"name": "1199", "dtype": "float32"}, {"name": "1200", "dtype": "float32"}, {"name": "1201", "dtype": "float32"}, {"name": "1202", "dtype": "float32"}, {"name": "1203", "dtype": "float32"}, {"name": "1204", "dtype": "float32"}, {"name": "1205", "dtype": "float32"}, {"name": "1206", "dtype": "float32"}, {"name": "1207", "dtype": "float32"}, {"name": "1208", "dtype": "float32"}, {"name": "1209", "dtype": "float32"}, {"name": "1210", "dtype": "float32"}, {"name": "1211", "dtype": "float32"}, {"name": "1212", "dtype": "float32"}, {"name": "1213", "dtype": "float32"}, {"name": "1214", "dtype": "float32"}, {"name": "1215", "dtype": "float32"}, {"name": "1216", "dtype": "float32"}, {"name": "1217", "dtype": "float32"}, {"name": "1218", "dtype": "float32"}, {"name": "1219", "dtype": "float32"}, {"name": "1220", "dtype": "float32"}, {"name": "1221", "dtype": "float32"}, {"name": "1222", "dtype": "float32"}, {"name": "1223", "dtype": "float32"}, {"name": "1224", "dtype": "float32"}, {"name": "1225", "dtype": "float32"}, {"name": "1226", "dtype": "float32"}, {"name": "1227", "dtype": "float32"}, {"name": "1228", "dtype": "float32"}, {"name": "1229", "dtype": "float32"}, {"name": "1230", "dtype": "float32"}, {"name": "1231", "dtype": "float32"}, {"name": "1232", "dtype": "float32"}, {"name": "1233", "dtype": "float32"}, {"name": "1234", "dtype": "float32"}, {"name": "1235", "dtype": "float32"}, {"name": "1236", "dtype": "float32"}, {"name": "1237", "dtype": "float32"}, {"name": "1238", "dtype": "float32"}, {"name": "1239", "dtype": 
"float32"}, {"name": "1240", "dtype": "float32"}, {"name": "1241", "dtype": "float32"}, {"name": "1242", "dtype": "float32"}, {"name": "1243", "dtype": "float32"}, {"name": "1244", "dtype": "float32"}, {"name": "1245", "dtype": "float32"}, {"name": "1246", "dtype": "float32"}, {"name": "1247", "dtype": "float32"}, {"name": "1248", "dtype": "float32"}, {"name": "1249", "dtype": "float32"}, {"name": "1250", "dtype": "float32"}, {"name": "1251", "dtype": "float32"}, {"name": "1252", "dtype": "float32"}, {"name": "1253", "dtype": "float32"}, {"name": "1254", "dtype": "float32"}, {"name": "1255", "dtype": "float32"}, {"name": "1256", "dtype": "float32"}, {"name": "1257", "dtype": "float32"}, {"name": "1258", "dtype": "float32"}, {"name": "1259", "dtype": "float32"}, {"name": "1260", "dtype": "float32"}, {"name": "1261", "dtype": "float32"}, {"name": "1262", "dtype": "float32"}, {"name": "1263", "dtype": "float32"}, {"name": "1264", "dtype": "float32"}, {"name": "1265", "dtype": "float32"}, {"name": "1266", "dtype": "float32"}, {"name": "1267", "dtype": "float32"}, {"name": "1268", "dtype": "float32"}, {"name": "1269", "dtype": "float32"}, {"name": "1270", "dtype": "float32"}, {"name": "1271", "dtype": "float32"}, {"name": "1272", "dtype": "float32"}, {"name": "1273", "dtype": "float32"}, {"name": "1274", "dtype": "float32"}, {"name": "1275", "dtype": "float32"}, {"name": "1276", "dtype": "float32"}, {"name": "1277", "dtype": "float32"}, {"name": "1278", "dtype": "float32"}, {"name": "1279", "dtype": "float32"}, {"name": "1280", "dtype": "float32"}, {"name": "1281", "dtype": "float32"}, {"name": "1282", "dtype": "float32"}, {"name": "1283", "dtype": "float32"}, {"name": "1284", "dtype": "float32"}, {"name": "1285", "dtype": "float32"}, {"name": "1286", "dtype": "float32"}, {"name": "1287", "dtype": "float32"}, {"name": "1288", "dtype": "float32"}, {"name": "1289", "dtype": "float32"}, {"name": "1290", "dtype": "float32"}, {"name": "1291", "dtype": "float32"}, {"name": "1292", "dtype": "float32"}, {"name": "1293", "dtype": "float32"}, {"name": "1294", "dtype": "float32"}, {"name": "1295", "dtype": "float32"}, {"name": "1296", "dtype": "float32"}, {"name": "1297", "dtype": "float32"}, {"name": "1298", "dtype": "float32"}, {"name": "1299", "dtype": "float32"}, {"name": "1300", "dtype": "float32"}, {"name": "1301", "dtype": "float32"}, {"name": "1302", "dtype": "float32"}, {"name": "1303", "dtype": "float32"}, {"name": "1304", "dtype": "float32"}, {"name": "1305", "dtype": "float32"}, {"name": "1306", "dtype": "float32"}, {"name": "1307", "dtype": "float32"}, {"name": "1308", "dtype": "float32"}, {"name": "1309", "dtype": "float32"}, {"name": "1310", "dtype": "float32"}, {"name": "1311", "dtype": "float32"}, {"name": "1312", "dtype": "float32"}, {"name": "1313", "dtype": "float32"}, {"name": "1314", "dtype": "float32"}, {"name": "1315", "dtype": "float32"}, {"name": "1316", "dtype": "float32"}, {"name": "1317", "dtype": "float32"}, {"name": "1318", "dtype": "float32"}, {"name": "1319", "dtype": "float32"}, {"name": "1320", "dtype": "float32"}, {"name": "1321", "dtype": "float32"}, {"name": "1322", "dtype": "float32"}, {"name": "1323", "dtype": "float32"}, {"name": "1324", "dtype": "float32"}, {"name": "1325", "dtype": "float32"}, {"name": "1326", "dtype": "float32"}, {"name": "1327", "dtype": "float32"}, {"name": "1328", "dtype": "float32"}, {"name": "1329", "dtype": "float32"}, {"name": "1330", "dtype": "float32"}, {"name": "1331", "dtype": "float32"}, {"name": "1332", "dtype": "float32"}, {"name": 
"1333", "dtype": "float32"}, {"name": "1334", "dtype": "float32"}, {"name": "1335", "dtype": "float32"}, {"name": "1336", "dtype": "float32"}, {"name": "1337", "dtype": "float32"}, {"name": "1338", "dtype": "float32"}, {"name": "1339", "dtype": "float32"}, {"name": "1340", "dtype": "float32"}, {"name": "1341", "dtype": "float32"}, {"name": "1342", "dtype": "float32"}, {"name": "1343", "dtype": "float32"}, {"name": "1344", "dtype": "float32"}, {"name": "1345", "dtype": "float32"}, {"name": "1346", "dtype": "float32"}, {"name": "1347", "dtype": "float32"}, {"name": "1348", "dtype": "float32"}, {"name": "1349", "dtype": "float32"}, {"name": "1350", "dtype": "float32"}, {"name": "1351", "dtype": "float32"}, {"name": "1352", "dtype": "float32"}, {"name": "1353", "dtype": "float32"}, {"name": "1354", "dtype": "float32"}, {"name": "1355", "dtype": "float32"}, {"name": "1356", "dtype": "float32"}, {"name": "1357", "dtype": "float32"}, {"name": "1358", "dtype": "float32"}, {"name": "1359", "dtype": "float32"}, {"name": "1360", "dtype": "float32"}, {"name": "1361", "dtype": "float32"}, {"name": "1362", "dtype": "float32"}, {"name": "1363", "dtype": "float32"}, {"name": "1364", "dtype": "float32"}, {"name": "1365", "dtype": "float32"}, {"name": "1366", "dtype": "float32"}, {"name": "1367", "dtype": "float32"}, {"name": "1368", "dtype": "float32"}, {"name": "1369", "dtype": "float32"}, {"name": "1370", "dtype": "float32"}, {"name": "1371", "dtype": "float32"}, {"name": "1372", "dtype": "float32"}, {"name": "1373", "dtype": "float32"}, {"name": "1374", "dtype": "float32"}, {"name": "1375", "dtype": "float32"}, {"name": "1376", "dtype": "float32"}, {"name": "1377", "dtype": "float32"}, {"name": "1378", "dtype": "float32"}, {"name": "1379", "dtype": "float32"}, {"name": "1380", "dtype": "float32"}, {"name": "1381", "dtype": "float32"}, {"name": "1382", "dtype": "float32"}, {"name": "1383", "dtype": "float32"}, {"name": "1384", "dtype": "float32"}, {"name": "1385", "dtype": "float32"}, {"name": "1386", "dtype": "float32"}, {"name": "1387", "dtype": "float32"}, {"name": "1388", "dtype": "float32"}, {"name": "1389", "dtype": "float32"}, {"name": "1390", "dtype": "float32"}, {"name": "1391", "dtype": "float32"}, {"name": "1392", "dtype": "float32"}, {"name": "1393", "dtype": "float32"}, {"name": "1394", "dtype": "float32"}, {"name": "1395", "dtype": "float32"}, {"name": "1396", "dtype": "float32"}, {"name": "1397", "dtype": "float32"}, {"name": "1398", "dtype": "float32"}, {"name": "1399", "dtype": "float32"}, {"name": "1400", "dtype": "float32"}, {"name": "1401", "dtype": "float32"}, {"name": "1402", "dtype": "float32"}, {"name": "1403", "dtype": "float32"}, {"name": "1404", "dtype": "float32"}, {"name": "1405", "dtype": "float32"}, {"name": "1406", "dtype": "float32"}, {"name": "1407", "dtype": "float32"}, {"name": "1408", "dtype": "float32"}, {"name": "1409", "dtype": "float32"}, {"name": "1410", "dtype": "float32"}, {"name": "1411", "dtype": "float32"}, {"name": "1412", "dtype": "float32"}, {"name": "1413", "dtype": "float32"}, {"name": "1414", "dtype": "float32"}, {"name": "1415", "dtype": "float32"}, {"name": "1416", "dtype": "float32"}, {"name": "1417", "dtype": "float32"}, {"name": "1418", "dtype": "float32"}, {"name": "1419", "dtype": "float32"}, {"name": "1420", "dtype": "float32"}, {"name": "1421", "dtype": "float32"}, {"name": "1422", "dtype": "float32"}, {"name": "1423", "dtype": "float32"}, {"name": "1424", "dtype": "float32"}, {"name": "1425", "dtype": "float32"}, {"name": "1426", "dtype": 
"float32"}, {"name": "1427", "dtype": "float32"}, {"name": "1428", "dtype": "float32"}, {"name": "1429", "dtype": "float32"}, {"name": "1430", "dtype": "float32"}, {"name": "1431", "dtype": "float32"}, {"name": "1432", "dtype": "float32"}, {"name": "1433", "dtype": "float32"}, {"name": "1434", "dtype": "float32"}, {"name": "1435", "dtype": "float32"}, {"name": "1436", "dtype": "float32"}, {"name": "1437", "dtype": "float32"}, {"name": "1438", "dtype": "float32"}, {"name": "1439", "dtype": "float32"}, {"name": "1440", "dtype": "float32"}, {"name": "1441", "dtype": "float32"}, {"name": "1442", "dtype": "float32"}, {"name": "1443", "dtype": "float32"}, {"name": "1444", "dtype": "float32"}, {"name": "1445", "dtype": "float32"}, {"name": "1446", "dtype": "float32"}, {"name": "1447", "dtype": "float32"}, {"name": "1448", "dtype": "float32"}, {"name": "1449", "dtype": "float32"}, {"name": "1450", "dtype": "float32"}, {"name": "1451", "dtype": "float32"}, {"name": "1452", "dtype": "float32"}, {"name": "1453", "dtype": "float32"}, {"name": "1454", "dtype": "float32"}, {"name": "1455", "dtype": "float32"}, {"name": "1456", "dtype": "float32"}, {"name": "1457", "dtype": "float32"}, {"name": "1458", "dtype": "float32"}, {"name": "1459", "dtype": "float32"}, {"name": "1460", "dtype": "float32"}, {"name": "1461", "dtype": "float32"}, {"name": "1462", "dtype": "float32"}, {"name": "1463", "dtype": "float32"}, {"name": "1464", "dtype": "float32"}, {"name": "1465", "dtype": "float32"}, {"name": "1466", "dtype": "float32"}, {"name": "1467", "dtype": "float32"}, {"name": "1468", "dtype": "float32"}, {"name": "1469", "dtype": "float32"}, {"name": "1470", "dtype": "float32"}, {"name": "1471", "dtype": "float32"}, {"name": "1472", "dtype": "float32"}, {"name": "1473", "dtype": "float32"}, {"name": "1474", "dtype": "float32"}, {"name": "1475", "dtype": "float32"}, {"name": "1476", "dtype": "float32"}, {"name": "1477", "dtype": "float32"}, {"name": "1478", "dtype": "float32"}, {"name": "1479", "dtype": "float32"}, {"name": "1480", "dtype": "float32"}, {"name": "1481", "dtype": "float32"}, {"name": "1482", "dtype": "float32"}, {"name": "1483", "dtype": "float32"}, {"name": "1484", "dtype": "float32"}, {"name": "1485", "dtype": "float32"}, {"name": "1486", "dtype": "float32"}, {"name": "1487", "dtype": "float32"}, {"name": "1488", "dtype": "float32"}, {"name": "1489", "dtype": "float32"}, {"name": "1490", "dtype": "float32"}, {"name": "1491", "dtype": "float32"}, {"name": "1492", "dtype": "float32"}, {"name": "1493", "dtype": "float32"}, {"name": "1494", "dtype": "float32"}, {"name": "1495", "dtype": "float32"}, {"name": "1496", "dtype": "float32"}, {"name": "1497", "dtype": "float32"}, {"name": "1498", "dtype": "float32"}, {"name": "1499", "dtype": "float32"}, {"name": "1500", "dtype": "float32"}, {"name": "1501", "dtype": "float32"}, {"name": "1502", "dtype": "float32"}, {"name": "1503", "dtype": "float32"}, {"name": "1504", "dtype": "float32"}, {"name": "1505", "dtype": "float32"}, {"name": "1506", "dtype": "float32"}, {"name": "1507", "dtype": "float32"}, {"name": "1508", "dtype": "float32"}, {"name": "1509", "dtype": "float32"}, {"name": "1510", "dtype": "float32"}, {"name": "1511", "dtype": "float32"}, {"name": "1512", "dtype": "float32"}, {"name": "1513", "dtype": "float32"}, {"name": "1514", "dtype": "float32"}, {"name": "1515", "dtype": "float32"}, {"name": "1516", "dtype": "float32"}, {"name": "1517", "dtype": "float32"}, {"name": "1518", "dtype": "float32"}, {"name": "1519", "dtype": "float32"}, {"name": 
"1520", "dtype": "float32"}, {"name": "1521", "dtype": "float32"}, {"name": "1522", "dtype": "float32"}, {"name": "1523", "dtype": "float32"}, {"name": "1524", "dtype": "float32"}, {"name": "1525", "dtype": "float32"}, {"name": "1526", "dtype": "float32"}, {"name": "1527", "dtype": "float32"}, {"name": "1528", "dtype": "float32"}, {"name": "1529", "dtype": "float32"}, {"name": "1530", "dtype": "float32"}, {"name": "1531", "dtype": "float32"}, {"name": "1532", "dtype": "float32"}, {"name": "1533", "dtype": "float32"}, {"name": "1534", "dtype": "float32"}, {"name": "1535", "dtype": "float32"}, {"name": "1536", "dtype": "float32"}, {"name": "1537", "dtype": "float32"}, {"name": "1538", "dtype": "float32"}, {"name": "1539", "dtype": "float32"}, {"name": "1540", "dtype": "float32"}, {"name": "1541", "dtype": "float32"}, {"name": "1542", "dtype": "float32"}, {"name": "1543", "dtype": "float32"}, {"name": "1544", "dtype": "float32"}, {"name": "1545", "dtype": "float32"}, {"name": "1546", "dtype": "float32"}, {"name": "1547", "dtype": "float32"}, {"name": "1548", "dtype": "float32"}, {"name": "1549", "dtype": "float32"}, {"name": "1550", "dtype": "float32"}, {"name": "1551", "dtype": "float32"}, {"name": "1552", "dtype": "float32"}, {"name": "1553", "dtype": "float32"}, {"name": "1554", "dtype": "float32"}, {"name": "1555", "dtype": "float32"}, {"name": "1556", "dtype": "float32"}, {"name": "1557", "dtype": "float32"}, {"name": "1558", "dtype": "float32"}, {"name": "1559", "dtype": "float32"}, {"name": "1560", "dtype": "float32"}, {"name": "1561", "dtype": "float32"}, {"name": "1562", "dtype": "float32"}, {"name": "1563", "dtype": "float32"}, {"name": "1564", "dtype": "float32"}, {"name": "1565", "dtype": "float32"}, {"name": "1566", "dtype": "float32"}, {"name": "1567", "dtype": "float32"}, {"name": "1568", "dtype": "float32"}, {"name": "1569", "dtype": "float32"}, {"name": "1570", "dtype": "float32"}, {"name": "1571", "dtype": "float32"}, {"name": "1572", "dtype": "float32"}, {"name": "1573", "dtype": "float32"}, {"name": "1574", "dtype": "float32"}, {"name": "1575", "dtype": "float32"}, {"name": "1576", "dtype": "float32"}, {"name": "1577", "dtype": "float32"}, {"name": "1578", "dtype": "float32"}, {"name": "1579", "dtype": "float32"}, {"name": "1580", "dtype": "float32"}, {"name": "1581", "dtype": "float32"}, {"name": "1582", "dtype": "float32"}, {"name": "1583", "dtype": "float32"}, {"name": "1584", "dtype": "float32"}, {"name": "1585", "dtype": "float32"}, {"name": "1586", "dtype": "float32"}, {"name": "1587", "dtype": "float32"}, {"name": "1588", "dtype": "float32"}, {"name": "1589", "dtype": "float32"}, {"name": "1590", "dtype": "float32"}, {"name": "1591", "dtype": "float32"}, {"name": "1592", "dtype": "float32"}, {"name": "1593", "dtype": "float32"}, {"name": "1594", "dtype": "float32"}, {"name": "1595", "dtype": "float32"}, {"name": "1596", "dtype": "float32"}, {"name": "1597", "dtype": "float32"}, {"name": "1598", "dtype": "float32"}, {"name": "1599", "dtype": "float32"}, {"name": "1600", "dtype": "float32"}, {"name": "1601", "dtype": "float32"}, {"name": "1602", "dtype": "float32"}, {"name": "1603", "dtype": "float32"}, {"name": "1604", "dtype": "float32"}, {"name": "1605", "dtype": "float32"}, {"name": "1606", "dtype": "float32"}, {"name": "1607", "dtype": "float32"}, {"name": "1608", "dtype": "float32"}, {"name": "1609", "dtype": "float32"}, {"name": "1610", "dtype": "float32"}, {"name": "1611", "dtype": "float32"}, {"name": "1612", "dtype": "float32"}, {"name": "1613", "dtype": 
"float32"}, {"name": "1614", "dtype": "float32"}, {"name": "1615", "dtype": "float32"}, {"name": "1616", "dtype": "float32"}, {"name": "1617", "dtype": "float32"}, {"name": "1618", "dtype": "float32"}, {"name": "1619", "dtype": "float32"}, {"name": "1620", "dtype": "float32"}, {"name": "1621", "dtype": "float32"}, {"name": "1622", "dtype": "float32"}, {"name": "1623", "dtype": "float32"}, {"name": "1624", "dtype": "float32"}, {"name": "1625", "dtype": "float32"}, {"name": "1626", "dtype": "float32"}, {"name": "1627", "dtype": "float32"}, {"name": "1628", "dtype": "float32"}, {"name": "1629", "dtype": "float32"}, {"name": "1630", "dtype": "float32"}, {"name": "1631", "dtype": "float32"}, {"name": "1632", "dtype": "float32"}, {"name": "1633", "dtype": "float32"}, {"name": "1634", "dtype": "float32"}, {"name": "1635", "dtype": "float32"}, {"name": "1636", "dtype": "float32"}, {"name": "1637", "dtype": "float32"}, {"name": "1638", "dtype": "float32"}, {"name": "1639", "dtype": "float32"}, {"name": "1640", "dtype": "float32"}, {"name": "1641", "dtype": "float32"}, {"name": "1642", "dtype": "float32"}, {"name": "1643", "dtype": "float32"}, {"name": "1644", "dtype": "float32"}, {"name": "1645", "dtype": "float32"}, {"name": "1646", "dtype": "float32"}, {"name": "1647", "dtype": "float32"}, {"name": "1648", "dtype": "float32"}, {"name": "1649", "dtype": "float32"}, {"name": "1650", "dtype": "float32"}, {"name": "1651", "dtype": "float32"}, {"name": "1652", "dtype": "float32"}, {"name": "1653", "dtype": "float32"}, {"name": "1654", "dtype": "float32"}, {"name": "1655", "dtype": "float32"}, {"name": "1656", "dtype": "float32"}, {"name": "1657", "dtype": "float32"}, {"name": "1658", "dtype": "float32"}, {"name": "1659", "dtype": "float32"}, {"name": "1660", "dtype": "float32"}, {"name": "1661", "dtype": "float32"}, {"name": "1662", "dtype": "float32"}, {"name": "1663", "dtype": "float32"}, {"name": "1664", "dtype": "float32"}, {"name": "1665", "dtype": "float32"}, {"name": "1666", "dtype": "float32"}, {"name": "1667", "dtype": "float32"}, {"name": "1668", "dtype": "float32"}, {"name": "1669", "dtype": "float32"}, {"name": "1670", "dtype": "float32"}, {"name": "1671", "dtype": "float32"}, {"name": "1672", "dtype": "float32"}, {"name": "1673", "dtype": "float32"}, {"name": "1674", "dtype": "float32"}, {"name": "1675", "dtype": "float32"}, {"name": "1676", "dtype": "float32"}, {"name": "1677", "dtype": "float32"}, {"name": "1678", "dtype": "float32"}, {"name": "1679", "dtype": "float32"}, {"name": "1680", "dtype": "float32"}, {"name": "1681", "dtype": "float32"}, {"name": "1682", "dtype": "float32"}, {"name": "1683", "dtype": "float32"}, {"name": "1684", "dtype": "float32"}, {"name": "1685", "dtype": "float32"}, {"name": "1686", "dtype": "float32"}, {"name": "1687", "dtype": "float32"}, {"name": "1688", "dtype": "float32"}, {"name": "1689", "dtype": "float32"}, {"name": "1690", "dtype": "float32"}, {"name": "1691", "dtype": "float32"}, {"name": "1692", "dtype": "float32"}, {"name": "1693", "dtype": "float32"}, {"name": "1694", "dtype": "float32"}, {"name": "1695", "dtype": "float32"}, {"name": "1696", "dtype": "float32"}, {"name": "1697", "dtype": "float32"}, {"name": "1698", "dtype": "float32"}, {"name": "1699", "dtype": "float32"}, {"name": "1700", "dtype": "float32"}, {"name": "1701", "dtype": "float32"}, {"name": "1702", "dtype": "float32"}, {"name": "1703", "dtype": "float32"}, {"name": "1704", "dtype": "float32"}, {"name": "1705", "dtype": "float32"}, {"name": "1706", "dtype": "float32"}, {"name": 
"1707", "dtype": "float32"}, {"name": "1708", "dtype": "float32"}, {"name": "1709", "dtype": "float32"}, {"name": "1710", "dtype": "float32"}, {"name": "1711", "dtype": "float32"}, {"name": "1712", "dtype": "float32"}, {"name": "1713", "dtype": "float32"}, {"name": "1714", "dtype": "float32"}, {"name": "1715", "dtype": "float32"}, {"name": "1716", "dtype": "float32"}, {"name": "1717", "dtype": "float32"}, {"name": "1718", "dtype": "float32"}, {"name": "1719", "dtype": "float32"}, {"name": "1720", "dtype": "float32"}, {"name": "1721", "dtype": "float32"}, {"name": "1722", "dtype": "float32"}, {"name": "1723", "dtype": "float32"}, {"name": "1724", "dtype": "float32"}, {"name": "1725", "dtype": "float32"}, {"name": "1726", "dtype": "float32"}, {"name": "1727", "dtype": "float32"}, {"name": "1728", "dtype": "float32"}, {"name": "1729", "dtype": "float32"}, {"name": "1730", "dtype": "float32"}, {"name": "1731", "dtype": "float32"}, {"name": "1732", "dtype": "float32"}, {"name": "1733", "dtype": "float32"}, {"name": "1734", "dtype": "float32"}, {"name": "1735", "dtype": "float32"}, {"name": "1736", "dtype": "float32"}, {"name": "1737", "dtype": "float32"}, {"name": "1738", "dtype": "float32"}, {"name": "1739", "dtype": "float32"}, {"name": "1740", "dtype": "float32"}, {"name": "1741", "dtype": "float32"}, {"name": "1742", "dtype": "float32"}, {"name": "1743", "dtype": "float32"}, {"name": "1744", "dtype": "float32"}, {"name": "1745", "dtype": "float32"}, {"name": "1746", "dtype": "float32"}, {"name": "1747", "dtype": "float32"}, {"name": "1748", "dtype": "float32"}, {"name": "1749", "dtype": "float32"}, {"name": "1750", "dtype": "float32"}, {"name": "1751", "dtype": "float32"}, {"name": "1752", "dtype": "float32"}, {"name": "1753", "dtype": "float32"}, {"name": "1754", "dtype": "float32"}, {"name": "1755", "dtype": "float32"}, {"name": "1756", "dtype": "float32"}, {"name": "1757", "dtype": "float32"}, {"name": "1758", "dtype": "float32"}, {"name": "1759", "dtype": "float32"}, {"name": "1760", "dtype": "float32"}, {"name": "1761", "dtype": "float32"}, {"name": "1762", "dtype": "float32"}, {"name": "1763", "dtype": "float32"}, {"name": "1764", "dtype": "float32"}, {"name": "1765", "dtype": "float32"}, {"name": "1766", "dtype": "float32"}, {"name": "1767", "dtype": "float32"}, {"name": "1768", "dtype": "float32"}, {"name": "1769", "dtype": "float32"}, {"name": "1770", "dtype": "float32"}, {"name": "1771", "dtype": "float32"}, {"name": "1772", "dtype": "float32"}, {"name": "1773", "dtype": "float32"}, {"name": "1774", "dtype": "float32"}, {"name": "1775", "dtype": "float32"}, {"name": "1776", "dtype": "float32"}, {"name": "1777", "dtype": "float32"}, {"name": "1778", "dtype": "float32"}, {"name": "1779", "dtype": "float32"}, {"name": "1780", "dtype": "float32"}, {"name": "1781", "dtype": "float32"}, {"name": "1782", "dtype": "float32"}, {"name": "1783", "dtype": "float32"}, {"name": "1784", "dtype": "float32"}, {"name": "1785", "dtype": "float32"}, {"name": "1786", "dtype": "float32"}, {"name": "1787", "dtype": "float32"}, {"name": "1788", "dtype": "float32"}, {"name": "1789", "dtype": "float32"}, {"name": "1790", "dtype": "float32"}, {"name": "1791", "dtype": "float32"}, {"name": "1792", "dtype": "float32"}, {"name": "1793", "dtype": "float32"}, {"name": "1794", "dtype": "float32"}, {"name": "1795", "dtype": "float32"}, {"name": "1796", "dtype": "float32"}, {"name": "1797", "dtype": "float32"}, {"name": "1798", "dtype": "float32"}, {"name": "1799", "dtype": "float32"}, {"name": "1800", "dtype": 
"float32"}, {"name": "1801", "dtype": "float32"}, {"name": "1802", "dtype": "float32"}, {"name": "1803", "dtype": "float32"}, {"name": "1804", "dtype": "float32"}, {"name": "1805", "dtype": "float32"}, {"name": "1806", "dtype": "float32"}, {"name": "1807", "dtype": "float32"}, {"name": "1808", "dtype": "float32"}, {"name": "1809", "dtype": "float32"}, {"name": "1810", "dtype": "float32"}, {"name": "1811", "dtype": "float32"}, {"name": "1812", "dtype": "float32"}, {"name": "1813", "dtype": "float32"}, {"name": "1814", "dtype": "float32"}, {"name": "1815", "dtype": "float32"}, {"name": "1816", "dtype": "float32"}, {"name": "1817", "dtype": "float32"}, {"name": "1818", "dtype": "float32"}, {"name": "1819", "dtype": "float32"}, {"name": "1820", "dtype": "float32"}, {"name": "1821", "dtype": "float32"}, {"name": "1822", "dtype": "float32"}, {"name": "1823", "dtype": "float32"}, {"name": "1824", "dtype": "float32"}, {"name": "1825", "dtype": "float32"}, {"name": "1826", "dtype": "float32"}, {"name": "1827", "dtype": "float32"}, {"name": "1828", "dtype": "float32"}, {"name": "1829", "dtype": "float32"}, {"name": "1830", "dtype": "float32"}, {"name": "1831", "dtype": "float32"}, {"name": "1832", "dtype": "float32"}, {"name": "1833", "dtype": "float32"}, {"name": "1834", "dtype": "float32"}, {"name": "1835", "dtype": "float32"}, {"name": "1836", "dtype": "float32"}, {"name": "1837", "dtype": "float32"}, {"name": "1838", "dtype": "float32"}, {"name": "1839", "dtype": "float32"}, {"name": "1840", "dtype": "float32"}, {"name": "1841", "dtype": "float32"}, {"name": "1842", "dtype": "float32"}, {"name": "1843", "dtype": "float32"}, {"name": "1844", "dtype": "float32"}, {"name": "1845", "dtype": "float32"}, {"name": "1846", "dtype": "float32"}, {"name": "1847", "dtype": "float32"}, {"name": "1848", "dtype": "float32"}, {"name": "1849", "dtype": "float32"}, {"name": "1850", "dtype": "float32"}, {"name": "1851", "dtype": "float32"}, {"name": "1852", "dtype": "float32"}, {"name": "1853", "dtype": "float32"}, {"name": "1854", "dtype": "float32"}, {"name": "1855", "dtype": "float32"}, {"name": "1856", "dtype": "float32"}, {"name": "1857", "dtype": "float32"}, {"name": "1858", "dtype": "float32"}, {"name": "1859", "dtype": "float32"}, {"name": "1860", "dtype": "float32"}, {"name": "1861", "dtype": "float32"}, {"name": "1862", "dtype": "float32"}, {"name": "1863", "dtype": "float32"}, {"name": "1864", "dtype": "float32"}, {"name": "1865", "dtype": "float32"}, {"name": "1866", "dtype": "float32"}, {"name": "1867", "dtype": "float32"}, {"name": "1868", "dtype": "float32"}, {"name": "1869", "dtype": "float32"}, {"name": "1870", "dtype": "float32"}, {"name": "1871", "dtype": "float32"}, {"name": "1872", "dtype": "float32"}, {"name": "1873", "dtype": "float32"}, {"name": "1874", "dtype": "float32"}, {"name": "1875", "dtype": "float32"}, {"name": "1876", "dtype": "float32"}, {"name": "1877", "dtype": "float32"}, {"name": "1878", "dtype": "float32"}, {"name": "1879", "dtype": "float32"}, {"name": "1880", "dtype": "float32"}, {"name": "1881", "dtype": "float32"}, {"name": "1882", "dtype": "float32"}, {"name": "1883", "dtype": "float32"}, {"name": "1884", "dtype": "float32"}, {"name": "1885", "dtype": "float32"}, {"name": "1886", "dtype": "float32"}, {"name": "1887", "dtype": "float32"}, {"name": "1888", "dtype": "float32"}, {"name": "1889", "dtype": "float32"}, {"name": "1890", "dtype": "float32"}, {"name": "1891", "dtype": "float32"}, {"name": "1892", "dtype": "float32"}, {"name": "1893", "dtype": "float32"}, {"name": 
"1894", "dtype": "float32"}, {"name": "1895", "dtype": "float32"}, {"name": "1896", "dtype": "float32"}, {"name": "1897", "dtype": "float32"}, {"name": "1898", "dtype": "float32"}, {"name": "1899", "dtype": "float32"}, {"name": "1900", "dtype": "float32"}, {"name": "1901", "dtype": "float32"}, {"name": "1902", "dtype": "float32"}, {"name": "1903", "dtype": "float32"}, {"name": "1904", "dtype": "float32"}, {"name": "1905", "dtype": "float32"}, {"name": "1906", "dtype": "float32"}, {"name": "1907", "dtype": "float32"}, {"name": "1908", "dtype": "float32"}, {"name": "1909", "dtype": "float32"}, {"name": "1910", "dtype": "float32"}, {"name": "1911", "dtype": "float32"}, {"name": "1912", "dtype": "float32"}, {"name": "1913", "dtype": "float32"}, {"name": "1914", "dtype": "float32"}, {"name": "1915", "dtype": "float32"}, {"name": "1916", "dtype": "float32"}, {"name": "1917", "dtype": "float32"}, {"name": "1918", "dtype": "float32"}, {"name": "1919", "dtype": "float32"}, {"name": "1920", "dtype": "float32"}, {"name": "1921", "dtype": "float32"}, {"name": "1922", "dtype": "float32"}, {"name": "1923", "dtype": "float32"}, {"name": "1924", "dtype": "float32"}, {"name": "1925", "dtype": "float32"}, {"name": "1926", "dtype": "float32"}, {"name": "1927", "dtype": "float32"}, {"name": "1928", "dtype": "float32"}, {"name": "1929", "dtype": "float32"}, {"name": "1930", "dtype": "float32"}, {"name": "1931", "dtype": "float32"}, {"name": "1932", "dtype": "float32"}, {"name": "1933", "dtype": "float32"}, {"name": "1934", "dtype": "float32"}, {"name": "1935", "dtype": "float32"}, {"name": "1936", "dtype": "float32"}, {"name": "1937", "dtype": "float32"}, {"name": "1938", "dtype": "float32"}, {"name": "1939", "dtype": "float32"}, {"name": "1940", "dtype": "float32"}, {"name": "1941", "dtype": "float32"}, {"name": "1942", "dtype": "float32"}, {"name": "1943", "dtype": "float32"}, {"name": "1944", "dtype": "float32"}, {"name": "1945", "dtype": "float32"}, {"name": "1946", "dtype": "float32"}, {"name": "1947", "dtype": "float32"}, {"name": "1948", "dtype": "float32"}, {"name": "1949", "dtype": "float32"}, {"name": "1950", "dtype": "float32"}, {"name": "1951", "dtype": "float32"}, {"name": "1952", "dtype": "float32"}, {"name": "1953", "dtype": "float32"}, {"name": "1954", "dtype": "float32"}, {"name": "1955", "dtype": "float32"}, {"name": "1956", "dtype": "float32"}, {"name": "1957", "dtype": "float32"}, {"name": "1958", "dtype": "float32"}, {"name": "1959", "dtype": "float32"}, {"name": "1960", "dtype": "float32"}, {"name": "1961", "dtype": "float32"}, {"name": "1962", "dtype": "float32"}, {"name": "1963", "dtype": "float32"}, {"name": "1964", "dtype": "float32"}, {"name": "1965", "dtype": "float32"}, {"name": "1966", "dtype": "float32"}, {"name": "1967", "dtype": "float32"}, {"name": "1968", "dtype": "float32"}, {"name": "1969", "dtype": "float32"}, {"name": "1970", "dtype": "float32"}, {"name": "1971", "dtype": "float32"}, {"name": "1972", "dtype": "float32"}, {"name": "1973", "dtype": "float32"}, {"name": "1974", "dtype": "float32"}, {"name": "1975", "dtype": "float32"}, {"name": "1976", "dtype": "float32"}, {"name": "1977", "dtype": "float32"}, {"name": "1978", "dtype": "float32"}, {"name": "1979", "dtype": "float32"}, {"name": "1980", "dtype": "float32"}, {"name": "1981", "dtype": "float32"}, {"name": "1982", "dtype": "float32"}, {"name": "1983", "dtype": "float32"}, {"name": "1984", "dtype": "float32"}, {"name": "1985", "dtype": "float32"}, {"name": "1986", "dtype": "float32"}, {"name": "1987", "dtype": 
"float32"}, {"name": "1988", "dtype": "float32"}, {"name": "1989", "dtype": "float32"}, {"name": "1990", "dtype": "float32"}, {"name": "1991", "dtype": "float32"}, {"name": "1992", "dtype": "float32"}, {"name": "1993", "dtype": "float32"}, {"name": "1994", "dtype": "float32"}, {"name": "1995", "dtype": "float32"}, {"name": "1996", "dtype": "float32"}, {"name": "1997", "dtype": "float32"}, {"name": "1998", "dtype": "float32"}, {"name": "1999", "dtype": "float32"}, {"name": "2000", "dtype": "float32"}, {"name": "2001", "dtype": "float32"}, {"name": "2002", "dtype": "float32"}, {"name": "2003", "dtype": "float32"}, {"name": "2004", "dtype": "float32"}, {"name": "2005", "dtype": "float32"}, {"name": "2006", "dtype": "float32"}, {"name": "2007", "dtype": "float32"}, {"name": "2008", "dtype": "float32"}, {"name": "2009", "dtype": "float32"}, {"name": "2010", "dtype": "float32"}, {"name": "2011", "dtype": "float32"}, {"name": "2012", "dtype": "float32"}, {"name": "2013", "dtype": "float32"}, {"name": "2014", "dtype": "float32"}, {"name": "2015", "dtype": "float32"}, {"name": "2016", "dtype": "float32"}, {"name": "2017", "dtype": "float32"}, {"name": "2018", "dtype": "float32"}, {"name": "2019", "dtype": "float32"}, {"name": "2020", "dtype": "float32"}, {"name": "2021", "dtype": "float32"}, {"name": "2022", "dtype": "float32"}, {"name": "2023", "dtype": "float32"}, {"name": "2024", "dtype": "float32"}, {"name": "2025", "dtype": "float32"}, {"name": "2026", "dtype": "float32"}, {"name": "2027", "dtype": "float32"}, {"name": "2028", "dtype": "float32"}, {"name": "2029", "dtype": "float32"}, {"name": "2030", "dtype": "float32"}, {"name": "2031", "dtype": "float32"}, {"name": "2032", "dtype": "float32"}, {"name": "2033", "dtype": "float32"}, {"name": "2034", "dtype": "float32"}, {"name": "2035", "dtype": "float32"}, {"name": "2036", "dtype": "float32"}, {"name": "2037", "dtype": "float32"}, {"name": "2038", "dtype": "float32"}, {"name": "2039", "dtype": "float32"}, {"name": "2040", "dtype": "float32"}, {"name": "2041", "dtype": "float32"}, {"name": "2042", "dtype": "float32"}, {"name": "2043", "dtype": "float32"}, {"name": "2044", "dtype": "float32"}, {"name": "2045", "dtype": "float32"}, {"name": "2046", "dtype": "float32"}, {"name": "2047", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 213730684.90167296, "num_examples": 26057}, {"name": "test", "num_bytes": 71246297.07147655, "num_examples": 8686}], "download_size": 386094770, "dataset_size": 284976981.97314954}}
|
2023-08-17T20:45:21+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "AA_GPTNEO_Baseline"
More Information needed
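Since the card itself is only a stub, here is a minimal, non-authoritative sketch of how a dataset with the layout recorded in the metadata above (float32 columns named up to "2047" plus a string `label`, with `train` and `test` splits) might be loaded and turned into arrays. The repository id below is a hypothetical placeholder inferred from the card title, and the 2048-column count is likewise inferred from the highest column name shown.
```python
from datasets import load_dataset

# hypothetical repository id inferred from the card title -- adjust if it differs
REPO_ID = 'EgilKarlsen/AA_GPTNEO_Baseline'

ds = load_dataset(REPO_ID)                    # the metadata above records 'train' and 'test' splits
df = ds['train'].to_pandas()                  # float32 columns "0" ... "2047" plus a "label" string column

feature_cols = [str(i) for i in range(2048)]  # assumes columns are named "0" through "2047"
X = df[feature_cols].to_numpy()               # feature matrix, one 2048-dimensional row per example
y = df['label'].to_numpy()                    # string labels
print(X.shape, y[:5])
```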
|
[
"# Dataset Card for \"AA_GPTNEO_Baseline\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"AA_GPTNEO_Baseline\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"AA_GPTNEO_Baseline\"\n\nMore Information needed"
] |
cee516434454d2289943b55e4588df6cc500dcb0
|
# Dataset of kagiyama_hina/鍵山雛/카기야마히나 (Touhou)
This is the dataset of kagiyama_hina/鍵山雛/카기야마히나 (Touhou), containing 500 images and their tags.
The core tags of this character are `green_hair, ribbon, hair_ribbon, bow, hair_bow, front_ponytail, long_hair, green_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 801.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagiyama_hina_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 474.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagiyama_hina_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1213 | 968.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagiyama_hina_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 722.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagiyama_hina_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1213 | 1.29 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kagiyama_hina_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
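Besides the raw archive handled in the next section, the IMG+TXT packages listed above can be fetched the same way. The snippet below is a small sketch that assumes nothing beyond `hf_hub_download` returning the listed zip file; the local directory name is an arbitrary choice.
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# fetch the 800px IMG+TXT package from the table above
# (any other filename listed there works the same way)
zip_file = hf_hub_download(
    repo_id='CyberHarem/kagiyama_hina_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)
extract_dir = 'kagiyama_hina_800'  # arbitrary local directory
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)  # IMG+TXT layout: images stored alongside .txt tag files
```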
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kagiyama_hina_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blush, dress, frills, open_mouth, smile, solo |
| 1 | 10 |  |  |  |  |  | 1girl, frills, red_dress, solo, looking_at_viewer, short_sleeves, arm_ribbon |
| 2 | 5 |  |  |  |  |  | 1girl, bangs, closed_mouth, red_bow, red_ribbon, simple_background, smile, solo, white_background, frilled_ribbon, looking_at_viewer, puffy_short_sleeves, red_dress, upper_body, hair_between_eyes |
| 3 | 5 |  |  |  |  |  | 1girl, cleavage, dress, frills, medium_breasts, solo, smile, looking_at_viewer, blush, boots |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | dress | frills | open_mouth | smile | solo | red_dress | looking_at_viewer | short_sleeves | arm_ribbon | bangs | closed_mouth | red_bow | red_ribbon | simple_background | white_background | frilled_ribbon | puffy_short_sleeves | upper_body | hair_between_eyes | cleavage | medium_breasts | boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:---------|:-------------|:--------|:-------|:------------|:--------------------|:----------------|:-------------|:--------|:---------------|:----------|:-------------|:--------------------|:-------------------|:-----------------|:----------------------|:-------------|:--------------------|:-----------|:-----------------|:--------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | | | X | | | X | X | X | X | X | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | | | | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | | X | X | | X | | | | | | | | | | | | | X | X | X |
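Beyond the precomputed clusters above, overall tag frequencies can be recounted from the extracted raw dataset. The sketch below reuses the `LocalSource` pattern from the loading section and only assumes that `item.meta['tags']` is either a mapping keyed by tag or a plain iterable of tag strings.
```python
from collections import Counter

from waifuc.source import LocalSource

dataset_dir = 'dataset_dir'  # directory extracted in the loading section above
counter = Counter()
for item in LocalSource(dataset_dir):
    tags = item.meta['tags']
    # tags may be a dict (tag -> score) or a list of tag strings; count tag names either way
    counter.update(tags.keys() if isinstance(tags, dict) else tags)

for tag, count in counter.most_common(20):
    print(f'{tag}: {count}')
```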
|
CyberHarem/kagiyama_hina_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-17T20:38:45+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T12:44:35+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of kagiyama\_hina/鍵山雛/카기야마히나 (Touhou)
=============================================
This is the dataset of kagiyama\_hina/鍵山雛/카기야마히나 (Touhou), containing 500 images and their tags.
The core tags of this character are 'green\_hair, ribbon, hair\_ribbon, bow, hair\_bow, front\_ponytail, long\_hair, green\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
ff3c6b8ca360c576775cf944b1f125dcd83a644d
|
# Dataset Card for "AA_BERT_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
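Pending a fuller card, the following is a small sketch that assumes only what the config recorded later in this entry shows: integer-named float32 feature columns and `train`/`test` parquet splits. The integer-named columns are selected programmatically rather than hard-coded, since the feature list is not reproduced in full here.
```python
from datasets import load_dataset

ds = load_dataset('EgilKarlsen/AA_BERT_Finetuned')  # config in this entry defines 'train' and 'test' splits
train = ds['train']

# feature columns carry consecutive integer names ("0", "1", ...); pick them up without
# hard-coding how many there are, and keep any remaining columns (e.g. a label) aside
feature_cols = [c for c in train.column_names if c.isdigit()]
other_cols = [c for c in train.column_names if not c.isdigit()]

X = train.to_pandas()[feature_cols].to_numpy()  # feature matrix
print(X.shape, other_cols)
```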
|
EgilKarlsen/AA_BERT_Finetuned
|
[
"region:us"
] |
2023-08-17T20:46:45+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 80318780.21618997, "num_examples": 26057}, {"name": "test", "num_bytes": 26774087.073587257, "num_examples": 8686}], "download_size": 147067620, "dataset_size": 107092867.28977722}}
|
2023-08-23T02:54:18+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "AA_BERT_Finetuned"
More Information needed
|
[
"# Dataset Card for \"AA_BERT_Finetuned\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"AA_BERT_Finetuned\"\n\nMore Information needed"
] |
[
6,
19
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"AA_BERT_Finetuned\"\n\nMore Information needed"
] |
c5b08ce189144ea8415a69727a250e254350f97b
|
# Dataset Card for "AA_RoBERTa_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
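This repository appears to hold 768-dimensional float feature vectors (columns `0`-`767`) with a string `label`, split into `train` and `test`. A minimal loading sketch under that assumption (hypothetical usage, not an example provided by the dataset author):
```python
from datasets import load_dataset

# assumes the default config with train/test splits of 768 float32 columns plus a string "label"
ds = load_dataset("EgilKarlsen/AA_RoBERTa_Finetuned")
train = ds["train"]
print(train.column_names[:5], train.num_rows)
# rebuild one example's 768-dim vector from the numbered columns
row = train[0]
vector = [row[str(i)] for i in range(768)]
print(len(vector), row["label"])
```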
|
EgilKarlsen/AA_RoBERTa_Finetuned
|
[
"region:us"
] |
2023-08-17T20:51:54+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 80318780.21618997, "num_examples": 26057}, {"name": "test", "num_bytes": 26774087.073587257, "num_examples": 8686}], "download_size": 147169115, "dataset_size": 107092867.28977722}}
|
2023-08-23T03:01:01+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "AA_RoBERTa_Finetuned"
More Information needed
|
[
"# Dataset Card for \"AA_RoBERTa_Finetuned\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"AA_RoBERTa_Finetuned\"\n\nMore Information needed"
] |
[
6,
20
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"AA_RoBERTa_Finetuned\"\n\nMore Information needed"
] |
d57105f56b0e5e96380a68aa78a41d0a36dd8fa1
|
# Dataset of kamishirasawa_keine/上白沢慧音/上白沢慧音/카미시라사와케이네 (Touhou)
This is the dataset of kamishirasawa_keine/上白沢慧音/上白沢慧音/카미시라사와케이네 (Touhou), containing 500 images and their tags.
The core tags of this character are `long_hair, hat, blue_hair, red_eyes, breasts, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 481.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamishirasawa_keine_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 344.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamishirasawa_keine_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1029 | 641.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamishirasawa_keine_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 453.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamishirasawa_keine_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1029 | 795.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kamishirasawa_keine_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kamishirasawa_keine_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
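The IMG+TXT packages are plain archives in which each image is expected to ship with a same-stem `.txt` tag file; that layout is an assumption here, so treat the following as a minimal sketch rather than an official loader:
```python
import os
import zipfile
from glob import glob
from huggingface_hub import hf_hub_download
# download one of the IMG+TXT packages (the 800px variant here)
zip_file = hf_hub_download(
    repo_id='CyberHarem/kamishirasawa_keine_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract it next to the raw dataset
pair_dir = 'dataset_800'
os.makedirs(pair_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(pair_dir)
# pair every tag file with an image of the same stem (assumed package layout)
for txt_path in sorted(glob(os.path.join(pair_dir, '**', '*.txt'), recursive=True)):
    stem, _ = os.path.splitext(txt_path)
    images = [stem + ext for ext in ('.png', '.jpg', '.jpeg', '.webp') if os.path.exists(stem + ext)]
    with open(txt_path, 'r', encoding='utf-8') as f:
        tags = f.read().strip()
    print(images, tags)
```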
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, dress, green_hair, solo, horn_ribbon, smile, tail, closed_eyes |
| 1 | 19 |  |  |  |  |  | 1girl, bangs, blue_dress, blue_headwear, puffy_short_sleeves, solo, looking_at_viewer, red_neckerchief, hair_between_eyes, closed_mouth, simple_background, collarbone, smile, upper_body, blush, multicolored_hair, tokin_hat |
| 2 | 8 |  |  |  |  |  | 1girl, bangs, blue_dress, blue_headwear, open_mouth, puffy_short_sleeves, red_neckerchief, solo, looking_at_viewer, :d, brown_eyes, holding_book, open_book, simple_background, two-tone_hair, blush, collared_dress, tokin_hat, white_background |
| 3 | 9 |  |  |  |  |  | 1girl, blue_dress, solo, looking_at_viewer, smile, very_long_hair, puffy_short_sleeves, scroll |
| 4 | 5 |  |  |  |  |  | 1girl, blue_dress, blue_headwear, blush_stickers, one-hour_drawing_challenge, puffy_short_sleeves, simple_background, solo, white_background, closed_mouth, smile, tokin_hat, speech_bubble |
| 5 | 6 |  |  |  |  |  | 1girl, bangs, blue_dress, blue_headwear, full_body, looking_at_viewer, puffy_short_sleeves, red_bow, red_neckerchief, shoes, solo, closed_mouth, footwear_bow, standing, tokin_hat, two-tone_hair, black_footwear, collared_dress, medium_breasts, very_long_hair, white_socks |
| 6 | 12 |  |  |  |  |  | 1girl, solo, dress, scroll |
| 7 | 5 |  |  |  |  |  | 2girls, bangs, blue_dress, blue_headwear, puffy_short_sleeves, red_neckerchief, solo_focus, closed_mouth, upper_body, hair_between_eyes, lips, smile |
| 8 | 16 |  |  |  |  |  | blush, 1girl, blue_dress, dog_ears, dog_tail, chibi, kemonomimi_mode, solo, blue_eyes, fang, open_mouth, :3, expressive_clothes, smile, minigirl |
| 9 | 7 |  |  |  |  |  | 2girls, dress, medium_breasts, grey_hair, cleavage |
| 10 | 11 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, solo_focus, huge_breasts, penis, nude, sweat, censored, open_mouth, pubic_hair, pussy, sex, vaginal |
| 11 | 5 |  |  |  |  |  | female_pubic_hair, large_breasts, navel, nude, 1girl, colored_pubic_hair, nipples, solo, pussy, blush, uncensored |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dress | green_hair | solo | horn_ribbon | smile | tail | closed_eyes | bangs | blue_dress | blue_headwear | puffy_short_sleeves | looking_at_viewer | red_neckerchief | hair_between_eyes | closed_mouth | simple_background | collarbone | upper_body | blush | multicolored_hair | tokin_hat | open_mouth | :d | brown_eyes | holding_book | open_book | two-tone_hair | collared_dress | white_background | very_long_hair | scroll | blush_stickers | one-hour_drawing_challenge | speech_bubble | full_body | red_bow | shoes | footwear_bow | standing | black_footwear | medium_breasts | white_socks | 2girls | solo_focus | lips | dog_ears | dog_tail | chibi | kemonomimi_mode | blue_eyes | fang | :3 | expressive_clothes | minigirl | grey_hair | cleavage | 1boy | hetero | nipples | huge_breasts | penis | nude | sweat | censored | pubic_hair | pussy | sex | vaginal | female_pubic_hair | large_breasts | navel | colored_pubic_hair | uncensored |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:-------------|:-------|:--------------|:--------|:-------|:--------------|:--------|:-------------|:----------------|:----------------------|:--------------------|:------------------|:--------------------|:---------------|:--------------------|:-------------|:-------------|:--------|:--------------------|:------------|:-------------|:-----|:-------------|:---------------|:------------|:----------------|:-----------------|:-------------------|:-----------------|:---------|:-----------------|:-----------------------------|:----------------|:------------|:----------|:--------|:---------------|:-----------|:-----------------|:-----------------|:--------------|:---------|:-------------|:-------|:-----------|:-----------|:--------|:------------------|:------------|:-------|:-----|:---------------------|:-----------|:------------|:-----------|:-------|:---------|:----------|:---------------|:--------|:-------|:--------|:-----------|:-------------|:--------|:------|:----------|:--------------------|:----------------|:--------|:---------------------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 19 |  |  |  |  |  | X | | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | | X | | | | | X | X | X | X | X | X | | | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | | X | | X | | | | X | | X | X | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | X | | X | | | | X | X | X | | | | X | X | | | | | X | | | | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | X | | | | | X | X | X | X | X | X | | X | | | | | | X | | | | | | X | X | | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 12 |  |  |  |  |  | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | | | | | | X | | | X | X | X | X | | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 16 |  |  |  |  |  | X | | | X | | X | | | | X | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | |
| 10 | 11 |  |  |  |  |  | X | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 11 | 5 |  |  |  |  |  | X | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | X | | | X | X | X | X | X |
|
CyberHarem/kamishirasawa_keine_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-17T20:56:03+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T14:29:16+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of kamishirasawa\_keine/上白沢慧音/上白沢慧音/카미시라사와케이네 (Touhou)
==============================================================
This is the dataset of kamishirasawa\_keine/上白沢慧音/上白沢慧音/카미시라사와케이네 (Touhou), containing 500 images and their tags.
The core tags of this character are 'long\_hair, hat, blue\_hair, red\_eyes, breasts, white\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
c7865149f86111e658e57e92180794e32f903572
|
# Dataset Card for "AA_DistilRoBERTa_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
EgilKarlsen/AA_DistilRoBERTa_Finetuned
|
[
"region:us"
] |
2023-08-17T20:56:32+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 80318780.21618997, "num_examples": 26057}, {"name": "test", "num_bytes": 26774087.073587257, "num_examples": 8686}], "download_size": 147168339, "dataset_size": 107092867.28977722}}
|
2023-08-23T03:06:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "AA_DistilRoBERTa_Finetuned"
More Information needed
|
[
"# Dataset Card for \"AA_DistilRoBERTa_Finetuned\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"AA_DistilRoBERTa_Finetuned\"\n\nMore Information needed"
] |
[
6,
22
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"AA_DistilRoBERTa_Finetuned\"\n\nMore Information needed"
] |
985a829957b8c33e6f8a2c0a38c8172a59ce8b2c
|
# Dataset Card for "AA_GPT2_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
EgilKarlsen/AA_GPT2_Finetuned
|
[
"region:us"
] |
2023-08-17T21:01:45+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 80318780.21618997, "num_examples": 26057}, {"name": "test", "num_bytes": 26774087.073587257, "num_examples": 8686}], "download_size": 147165900, "dataset_size": 107092867.28977722}}
|
2023-08-23T03:12:20+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "AA_GPT2_Finetuned"
More Information needed
|
[
"# Dataset Card for \"AA_GPT2_Finetuned\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"AA_GPT2_Finetuned\"\n\nMore Information needed"
] |
[
6,
20
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"AA_GPT2_Finetuned\"\n\nMore Information needed"
] |
181eb631fe949c2fe50de8bae30d9397cf0be4ff
|
### Version 1
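The script below builds the first version of the corpus from the RyokoAI/ShareGPT52K dump: it merges both JSON parts, keeps only conversations that contain Cyrillic but no Chinese or Korean text, and then normalizes export artifacts and strips canned assistant disclaimers.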
```python
import json
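# Load both parts of the ShareGPT dump and merge them into one list of conversations.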
with open("verbalist/datasets/RyokoAI_ShareGPT52K/sg_90k_part1.json") as f:
dataset1 = json.load(f)
with open("verbalist/datasets/RyokoAI_ShareGPT52K/sg_90k_part2.json") as f:
dataset2 = json.load(f)
dataset = dataset1 + dataset2
import re
import regex
import hashlib
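# filter_string: True only when the text contains no Chinese (CJK) characters, no Korean (Hangul)
# characters, and none of the explicitly disallowed letters listed below.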
def filter_string(string):
has = True
has_zh = not len(re.findall(r"[\u4e00-\u9fff]+", string)) > 0
has_ko = not len(re.findall(r"[\u3131-\ucb4c]+", string)) > 0
has = has_zh and has_ko
invalid_letters = "ΡΡùéà çΔΔ°ΕΎΕ‘"
for letter in invalid_letters:
if letter in string:
return False
return has
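# has_cyrillic: True when the text contains at least one Cyrillic character.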
def has_cyrillic(text):
return bool(regex.search(r"\p{IsCyrillic}", text))
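# Keep only conversations that pass the character filter and contain some Cyrillic text.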
clean_dataset = []
for conversation in dataset:
all_text = "\n".join([item["value"] for item in conversation["conversations"]])
# print(all_text)
# break
if filter_string(all_text) and has_cyrillic(all_text):
clean_dataset.append(conversation)
import markdownify
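# correct_string: undo ChatGPT-export markdown artifacts (escaped underscores, "Copy code" markers)
# and strip canned disclaimer phrases from the replies.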
def correct_string(string):
string = string.replace("\\_", "_")
languages = [
"css",
"python",
"go",
"html",
"kotlin",
"diff",
"vba",
"sql",
]
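# The web export flattens fenced code blocks into "<lang>Copy code" runs; rewrite them back into normal fences.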
for lang in languages:
string = string.replace(f"\n{lang}Copy code`", f"{lang}\n")
string = string.replace("`\n```", "\n```")
string = string.replace("\n ", "\n ")
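# Canned assistant disclaimers and UI counters ("1 / 1" ... "9 / 9") that are removed from replies.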
delete_phrases = [
"ΠΠ°ΠΊ ΠΈΡΠΊΡΡΡΡΠ²Π΅Π½Π½ΡΠΉ ΠΈΠ½ΡΠ΅Π»Π»Π΅ΠΊΡ, Ρ Π½Π΅ ΡΠ²Π»ΡΡΡΡ Π²Π»Π°Π΄Π΅Π»ΡΡΠ΅ΠΌ ΡΠΈΠ·ΠΈΡΠ΅ΡΠΊΠΈΡ
ΠΎΠ±ΡΠ΅ΠΊΡΠΎΠ² ΠΈ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΏΡΠΎΠ΄Π°Π²Π°ΡΡ ΠΈΠ»ΠΈ ΠΏΠΎΠΊΡΠΏΠ°ΡΡ ΠΏΡΠ΅Π΄ΠΌΠ΅ΡΡ. ΠΠ΄Π½Π°ΠΊΠΎ, Ρ ΠΌΠΎΠ³Ρ ΠΏΠΎΠ΄Π΅Π»ΠΈΡΡΡΡ ΡΠΎΠ²Π΅ΡΠΎΠΌ, ΠΊΠ°ΠΊ ΠΌΠΎΠΆΠ½ΠΎ ΠΏΠΎΠΏΡΡΠ°ΡΡΡΡ ΡΠ±Π΅Π΄ΠΈΡΡ ΠΊΠΎΠ³ΠΎ-ΡΠΎ Π² ΠΏΠΎΠΊΡΠΏΠΊΠ΅ ΠΊΠ°ΡΠ°Π½Π΄Π°ΡΠ°.",
"ΠΠ°ΠΊ ΠΈΡΠΊΡΡΡΡΠ²Π΅Π½Π½ΡΠΉ ΠΈΠ½ΡΠ΅Π»Π»Π΅ΠΊΡ, Ρ Π½Π΅ ΠΈΠΌΠ΅Ρ Π»ΠΈΡΠ½ΡΡ
ΡΡΠ²ΡΡΠ² ΠΈ ΠΌΠ½Π΅Π½ΠΈΠΉ, ΠΈ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΈΠΌΠ΅ΡΡ ΠΏΡΠ΅Π΄ΠΏΠΎΡΡΠ΅Π½ΠΈΠΉ Π² Π²ΡΠ±ΠΎΡΠ΅ ΠΌΠ΅ΠΆΠ΄Ρ ΡΠΎΠΆΠ΄Π΅Π½ΠΈΠ΅ΠΌ ΡΠ²ΠΎΠ΅Π³ΠΎ ΡΠ΅Π±Π΅Π½ΠΊΠ° ΠΈ ΡΡΡΠ½ΠΎΠ²Π»Π΅Π½ΠΈΠ΅ΠΌ ΠΏΡΠΈΠ΅ΠΌΠ½ΠΎΠ³ΠΎ ΡΠ΅Π±Π΅Π½ΠΊΠ° ΠΈΠ· ΠΏΡΠΈΡΡΠ°.",
"1 / 1",
"2 / 2",
"3 / 3",
"4 / 4",
"5 / 5",
"6 / 6",
"7 / 7",
"8 / 8",
"9 / 9",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΏΡΠΎΠ²Π΅ΡΠΈΡΡ Π΄Π°ΡΡ Π²ΠΎΠΏΡΠΎΡΠ°, Π½ΠΎ Ρ ΠΌΠΎΠ³Ρ ΠΏΡΠ΅Π΄ΠΎΡΡΠ°Π²ΠΈΡΡ ΠΈΠ½ΡΠΎΡΠΌΠ°ΡΠΈΡ ΠΎ ΠΠ°ΠΊΡΠΈΠΌΠ΅ Π Π°Π΄Π°ΠΉΠΊΠΈΠ½Π΅ ΠΈ ΠΠΎΡΠΈΡΠ΅ ΠΠ°ΡΡΠΈΠ½ΠΊΠ΅Π²ΠΈΡΠ΅ Π½Π° Π½Π°ΡΠ°Π»ΠΎ 2021 Π³ΠΎΠ΄Π°.",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π²ΡΡΠ°Π²ΠΈΡΡ ΠΏΡΠΈΠΌΠ΅Ρ Π±Π°Π·Ρ Π΄Π°Π½Π½ΡΡ
Π² ΡΠ°Ρ, Π½ΠΎ Ρ ΠΌΠΎΠ³Ρ ΠΎΠ±ΡΡΡΠ½ΠΈΡΡ, ΡΡΠΎ ΡΡΠΎ ΡΠ°ΠΊΠΎΠ΅.",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π½Π°ΠΏΠΈΡΠ°ΡΡ ΠΏΠΎΠ»Π½ΠΎΡΠ΅Π½Π½ΠΎΠ΅ ΡΠ°ΡΡΠΈΡΠ΅Π½ΠΈΠ΅ Π΄Π»Ρ Google Chrome Π² ΡΠ°ΠΌΠΊΠ°Ρ
ΡΡΠΎΠΉ ΡΠ΅ΡΡΠΈΠΈ. ΠΠ΄Π½Π°ΠΊΠΎ,",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π²ΡΠ±ΡΠ°ΡΡ ΠΌΠ°ΡΠ΅ΡΠΈΠ°Π»Ρ ΠΈ Π΄ΠΈΠ·Π°ΠΉΠ½ Π·Π° Π²Π°Ρ, ΡΠ°ΠΊ ΠΊΠ°ΠΊ ΡΡΠΎ Π·Π°Π²ΠΈΡΠΈΡ ΠΎΡ Π²Π°ΡΠΈΡ
ΠΏΠΎΡΡΠ΅Π±Π½ΠΎΡΡΠ΅ΠΉ ΠΈ ΠΏΡΠ΅Π΄ΠΏΠΎΡΡΠ΅Π½ΠΈΠΉ. ΠΠ΄Π½Π°ΠΊΠΎ,",
"ΠΠ·Π²ΠΈΠ½ΠΈΡΠ΅, Π½ΠΎ Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΡΠΎΠ·Π΄Π°ΡΡ ΠΊΠΎΠ΄ Π΄Π»Ρ ΡΠ°ΠΊΠΎΠΉ ΡΠ»ΠΎΠΆΠ½ΠΎΠΉ ΠΏΡΠΎΠ³ΡΠ°ΠΌΠΌΡ Ρ Π½ΡΠ»Ρ, ΡΠ°ΠΊ ΠΊΠ°ΠΊ ΡΡΠΎ ΠΏΠΎΡΡΠ΅Π±ΠΎΠ²Π°Π»ΠΎ Π±Ρ ΠΎΠ±ΡΠΈΡΠ½ΡΡ
ΠΈΡΡΠ»Π΅Π΄ΠΎΠ²Π°Π½ΠΈΠΉ, ΡΠ΅ΡΡΡΡΠΎΠ² ΠΈ ΠΎΠΏΡΡΠ°. Π’Π΅ΠΌ Π½Π΅ ΠΌΠ΅Π½Π΅Π΅,",
"As an AI language model",
"I'm sorry, but I'm a text-based AI language model and don't have the capability to create tables.",
"Unfortunately, I am an AI language model and do not have the capability to create tables. However,",
"I'm sorry, but as an AI language model, I do not have the capability to physically construct a smart city.",
"Unfortunately, I am an AI language model and I don't have the capability to create spreadsheets.",
"I'm sorry for the delay. Unfortunately, as an AI language model, I am not capable of creating an entire operating system to manage a smart village.",
"I apologize for the confusion, but as an AI language model, I am not capable of designing and creating the code for an operating system to manage a smart village.",
"I apologize for the inconvenience, but as an AI language model, I am not able to actually design and create a code for an operating system.",
"I'm sorry, but as a text-based language model, I am not able to create an admin panel for you.",
'ΠΠ°ΠΊ ΠΌΠΎΠ΄Π΅Π»Ρ ΠΠ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΎΡΠ΅Π½ΠΈΡΡ, ΡΠ²Π»ΡΠ΅ΡΡΡ Π»ΠΈ ΠΏΡΠΎΠΈΠ·Π½Π΅ΡΠ΅Π½ΠΈΠ΅ ΡΡΠ°Π·Ρ "ΡΠΎΠ»ΡΠ½ΡΠΉ ΠΎΠ³ΡΡΠ΅Ρ" ΡΠ°ΡΠΈΠΎΠ½Π°Π»ΡΠ½ΡΠΌ ΠΈΡΠΏΠΎΠ»ΡΠ·ΠΎΠ²Π°Π½ΠΈΠ΅ΠΌ Π²ΡΠ΅ΠΌΠ΅Π½ΠΈ ΠΈΠ»ΠΈ Π½Π΅Ρ, ΠΏΠΎΡΠΎΠΌΡ ΡΡΠΎ ΡΡΠΎ Π²ΠΎΠΏΡΠΎΡ ΠΎΡΠ΅Π½ΠΊΠΈ ΡΠ΅Π½Π½ΠΎΡΡΠΈ ΠΈ ΡΠ΅Π»Π΅ΠΉ ΡΠ΅Π»ΠΎΠ²Π΅ΠΊΠ°.',
]
for phrase in delete_phrases:
string = string.replace(phrase, "").strip()
return string
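# Conversation-level blacklist: any conversation whose full text contains one of
# these keywords/phrases (case-insensitive) is dropped entirely.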
def filter_keywords(string):
keywords = [
"chatgpt",
"ΡΠ°ΡΠ³ΠΏΡ",
"sharegpt",
"add_user_to_chatroom()",
"ΠΌΠΈΡ",
"Π²ΠΎΠΉΠ½",
"ΡΠΎΡΡΠΈΡ",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΏΡΠΎΠ΄ΠΎΠ»ΠΆΠΈΡΡ ΠΏΠΈΡΠ°ΡΡ Π½Π° ΡΡΡΡΠΊΠΎΠΌ ΡΠ·ΡΠΊΠ΅, ΠΏΠΎΡΠΎΠΌΡ ΡΡΠΎ Ρ ΠΎΠ³ΡΠ°Π½ΠΈΡΠ΅Π½",
"Π― ΠΏΡΠΎΡΡ ΠΏΡΠΎΡΠ΅Π½ΠΈΡ, Π½ΠΎ, ΠΊΠ°ΠΊ Ρ ΡΠΆΠ΅ ΡΠΏΠΎΠΌΠΈΠ½Π°Π» ΡΠ°Π½Π΅Π΅",
"Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π²ΡΠΏΠΎΠ»Π½ΠΈΡΡ",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π½Π°ΠΏΠΈΡΠ°ΡΡ Π½ΠΎΡΡ Π΄Π»Ρ Π½Π΅ΡΡΡΠ΅ΡΡΠ²ΡΡΡΠΈΡ
ΡΡΠΈΡ
ΠΎΠ²,",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΡΠ³Π΅Π½Π΅ΡΠΈΡΠΎΠ²Π°ΡΡ ΠΏΠΎΠ»Π½ΡΠΉ ΠΊΠΎΠ΄ Π±ΡΠ°ΡΠ·Π΅ΡΠ½ΠΎΠΉ ΠΈΠ³ΡΡ",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΏΡΠΎΠ²Π΅ΡΡΠΈ ΡΠ°ΠΊΠΎΠΉ ΠΏΠΎΠ΄ΡΡΠ΅Ρ, ΠΏΠΎΡΠΎΠΌΡ ΡΡΠΎ ΡΡΠΎ ΠΏΠΎΡΡΠ΅Π±ΠΎΠ²Π°Π»ΠΎ Π±Ρ ΡΡΡΠ½ΠΎΠΉ ΠΎΠ±ΡΠ°Π±ΠΎΡΠΊΠΈ",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π½Π°Π·Π²Π°ΡΡ ΡΠΎΡΠ½ΡΡ ΡΠΈΡΡΡ, ΡΠ°ΠΊ ΠΊΠ°ΠΊ ΡΡΠΎ ΡΡΠ±ΡΠ΅ΠΊΡΠΈΠ²Π½ΡΠΉ Π²ΠΎΠΏΡΠΎΡ, Π·Π°Π²ΠΈΡΡΡΠΈΠΉ ΠΎΡ ΠΌΠ½ΠΎΠ³ΠΈΡ
ΡΠ°ΠΊΡΠΎΡΠΎΠ².",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π²ΡΠΏΠΎΠ»Π½ΠΈΡΡ Π²Π°Ρ Π·Π°ΠΏΡΠΎΡ, ΡΠ°ΠΊ ΠΊΠ°ΠΊ ΡΡΠΎ Π½Π°ΡΡΡΠ°Π΅Ρ ΠΌΠΎΠΈ ΡΡΠΈΡΠ΅ΡΠΊΠΈΠ΅ ΠΏΡΠΈΠ½ΡΠΈΠΏΡ ΠΈ ΠΌΠΎΠΆΠ΅Ρ ΠΏΡΠΈΡΠΈΠ½ΠΈΡΡ Π²ΡΠ΅Π΄.",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΎΡΠ²Π΅ΡΠΈΡΡ Π½Π° ΡΡΠΎΡ Π²ΠΎΠΏ",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΏΡΠ΅Π΄ΠΎΡΡΠ°Π²ΠΈΡΡ Π²Π°ΠΌ Π°ΠΊΡΡΠ°Π»ΡΠ½ΡΠ΅ Π΄Π°Π½Π½ΡΠ΅ ΠΎ ΡΡΠ΅Π΄Π½Π΅Π΄ΡΡΠ΅Π²ΡΡ
Π΄Π΅Π½Π΅ΠΆΠ½ΡΡ
Π΄ΠΎΡ
ΠΎΠ΄Π°Ρ
Π½Π°ΡΠ΅Π»Π΅Π½ΠΈΡ ΠΏΠΎ Π³ΠΎΡΠΎΠ΄Π°ΠΌ Π ΠΎΡΡΠΈΠΈ"
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΡΠΎΡΠ½ΠΎ ΠΎΡΠ²Π΅ΡΠΈΡΡ Π½Π° ΡΡΠΎΡ Π²ΠΎΠΏΡΠΎΡ, ΡΠ°ΠΊ ΠΊΠ°ΠΊ ΠΎΠ±ΡΠ΅ΠΌ ΠΈΠ·ΡΡΠ΅Π½Π½ΠΎΠΉ ΠΈΠ½ΡΠΎΡΠΌΠ°ΡΠΈΠΈ",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΡΠΎΠ·Π΄Π°Π²",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΡΠΈΡΠΎΠ²Π°ΡΡ Π² ASCII-ΡΡΠΈΠ»Π΅, ΡΠ°ΠΊ ΠΊΠ°ΠΊ Ρ ΡΠΎΠ»ΡΠΊΠΎ ΡΠ΅ΠΊΡΡΠΎΠ²Π°Ρ ΠΏΡΠΎΠ³ΡΠ°ΠΌΠΌΠ°.",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΡΠΎΠ·Π΄Π°Π²Π°ΡΡ ΠΈΠ·ΠΎΠ±ΡΠ°ΠΆΠ΅Π½ΠΈΡ Π½Π°ΠΏΡΡΠΌΡΡ Π² ΡΡΠΎΠΌ ΠΎΠΊΠ½Π΅ ΡΠ°ΡΠ°.",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π½Π°ΡΠΈΡΠΎΠ²Π°ΡΡ ΡΡΠ΅Π½Ρ ΠΈΠ· ΠΠ²Π°Π½Π³Π΅Π»ΠΈΠΎΠ½Π°, ΡΠ°ΠΊ ΠΊΠ°ΠΊ Ρ ΡΠ΅ΠΊΡΡΠΎΠ²Π°Ρ ΠΏΡΠΎΠ³ΡΠ°ΠΌΠΌΠ°",
"Π ΡΠΊΠΎΠ»ΡΠΊΠΎ Π½ΡΠ»Π΅ΠΉ?",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π½Π°ΠΏΠΈΡΠ°ΡΡ ΠΊΠ½ΠΈΠ³Ρ",
"ΠΠ·Π²ΠΈΠ½ΠΈΡΠ΅, Π½ΠΎ, ΠΊΠ°ΠΊ ΡΠΏΠΎΠΌΠΈΠ½Π°Π»ΠΎΡΡ ΡΠ°Π½Π΅Π΅, ΠΈΠ½ΡΠΎΡΠΌΠ°ΡΠΈΡ, ΠΏΡΠ΅Π΄ΡΡΠ°Π²Π»Π΅Π½Π½Π°Ρ Π² Π½Π°ΡΠ΅ΠΌ ΡΠ°Π·Π³ΠΎΠ²ΠΎΡΠ΅, Π½Π΅ ΠΏΠΎΠ΄Ρ
ΠΎΠ΄ΠΈΡ ΠΈ Π½Π΅ ΡΡΠΈΡΠ½Π°",
"ΠΠ·Π²ΠΈΠ½ΠΈΡΠ΅, Π½ΠΎ ΠΊΠ°ΠΊ ΡΠ·ΡΠΊΠΎΠ²Π°Ρ ΠΌΠΎΠ΄Π΅Π»Ρ ΠΠ Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π³Π΅Π½Π΅ΡΠΈΡΠΎΠ²Π°ΡΡ ΠΊΠΎΠ΄, ΠΊΠΎΡΠΎΡΡΠΉ ΡΠΏΡΠ°Π²Π»ΡΠ΅Ρ Π°Π΄ΠΌΠΈΠ½ΠΈΡΡΡΠ°ΡΠΈΠ΅ΠΉ",
"ΠΊΠ°ΠΊ ΡΠ·ΡΠΊΠΎΠ²Π°Ρ ΠΌΠΎΠ΄Π΅Π»Ρ",
"OpenAI",
"ΠΡΠΎΡΡ ΠΏΡΠΎΡΠ΅Π½ΠΈΡ, Π½ΠΎ, ΠΏΠΎΡ
ΠΎΠΆΠ΅, Π½Π°Ρ ΡΠ°Π·Π³ΠΎΠ²ΠΎΡ ΠΏΡΠΎΠ΄ΠΎΠ»ΠΆΠ°Π΅ΡΡΡ ΡΠΆΠ΅ Π΄Π°Π²Π½ΠΎ, ΠΈ Ρ Π½Π΅ ΡΠ²Π΅ΡΠ΅Π½, ΠΊΠ°ΠΊΠΎΠ²Π° ΡΠ΅ΠΊΡΡΠ°Ρ ΡΠ΅ΠΌΠ°.",
"ΡΠ²Π»ΡΡΡΡ ΡΠ·ΡΠΊΠΎΠ²ΠΎΠΉ ΠΌΠΎΠ΄Π΅Π»ΡΡ ΠΠ",
"I cannot create a program for managing",
"Π½Π΅ΠΎΠ½Π°ΡΠΈ",
"ΡΠΊΡΠ°ΠΈΠ½",
"provide instructions or assistance on hacking or any other illegal activities",
"I cannot fulfill your request as it goes against ethical and moral",
"I cannot do your math homework for you",
"adhering to ethical and moral standards",
"!GPT",
"Developer Mode Output",
"are illegal or unethical.",
"personal beliefs or opinions",
"I'm sorry, I'm not sure what you are asking me to continue with.",
"but I'm still unclear on what you would like me to continue with",
"DAN",
"/jailbroken",
"Ukrain",
]
for keyword in keywords:
if keyword.lower() in string.lower():
return False
return True
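# Build alternating user/bot message pairs, dropping a leading chathub.gg banner
# if present and keeping only conversations that pass the blacklist above.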
total_string = ""
debug_dataset = False
unsensored_filtered_dataset = []
for conversation in clean_dataset:
conversation = [
str(markdownify.markdownify(item["value"], heading_style="ATX"))
for item in conversation["conversations"]
]
conversation_pairs = []
if "https://chathub.gg" in conversation[0]:
conversation.pop(0)
full_text = " ".join(conversation)
if filter_keywords(full_text):
for i in range(1, len(conversation)):
if (i + 1) % 2 == 0:
if debug_dataset:
bot_message = "BOT " + correct_string(conversation[i])
user_message = "USER " + correct_string(conversation[i - 1])
else:
bot_message = correct_string(conversation[i])
user_message = correct_string(conversation[i - 1])
conversation_pairs.append(user_message)
conversation_pairs.append(bot_message)
if len(conversation_pairs) > 0:
unsensored_filtered_dataset.append(conversation_pairs)
if debug_dataset:
all_text = "\n===\n".join([item for item in conversation_pairs])
total_string += all_text
total_string += "===" * 10
total_string += "\n"
total_string += "===" * 10
total_string += "\n"
total_string += "===" * 10
total_string += "\n"
# print(total_string)
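# Measure each kept conversation with the Llama-2 tokenizer and plot the length distribution.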
from transformers import AutoTokenizer
from verbalist.datasets.utils import visualize_hist
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
conversation_lengths = []
for conversation in unsensored_filtered_dataset:
all_text = "\n===\n".join([item for item in conversation])
conversation_lengths.append(len(tokenizer(all_text)["input_ids"]))
# print(all_text)
# print("="*100)
# print("="*100)
# print("="*100)
# break
# if has_cyrillic(all_text):
# rus_conv.append(conversation)
visualize_hist(conversation_lengths, "ru_share_gpt_filtered")
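# Keep only conversations shorter than the 85th percentile of token length.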
filter_num = 85
passed_convs = (
np.array(conversation_lengths) < np.percentile(conversation_lengths, filter_num)
).tolist()
unsensored_passed = []
for i, status in enumerate(passed_convs):
if status:
unsensored_passed.append(unsensored_filtered_dataset[i])
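# Final records: the conversation plus a SHA-256 hash of its first message,
# which can serve as a de-duplication key.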
unsensored_dataset = []
for conv in unsensored_passed:
conv_hash = hashlib.sha256(conv[0].encode('utf-8')).hexdigest()
unsensored_dataset.append({
"conversation": conv,
"hash": conv_hash
})
```
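
The pipeline above is what produces the published `dim/sharegpt_short_ru` corpus (a single `train` split with `conversation` and `hash` columns, per the schema below). A minimal sketch of loading it back with the `datasets` library follows; the `flatten_conversation` helper and the printed fields are only illustrative, not part of the dataset itself.

```python
from datasets import load_dataset

# One "train" split; each row has a "conversation" (list of alternating
# user/bot messages, user first) and a "hash" of the first user message.
ds = load_dataset("dim/sharegpt_short_ru", split="train")

def flatten_conversation(row):
    # Illustrative helper: render the alternating turns as readable text.
    roles = ["USER", "BOT"]
    return "\n".join(
        f"{roles[i % 2]}: {text}" for i, text in enumerate(row["conversation"])
    )

print(ds[0]["hash"])
print(flatten_conversation(ds[0]))
```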
|
dim/sharegpt_short_ru
|
[
"license:cc-by-nc-4.0",
"region:us"
] |
2023-08-17T21:15:08+00:00
|
{"license": "cc-by-nc-4.0", "dataset_info": {"features": [{"name": "conversation", "sequence": "string"}, {"name": "hash", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 825523, "num_examples": 253}], "download_size": 367027, "dataset_size": 825523}}
|
2023-09-01T23:53:23+00:00
|
[] |
[] |
TAGS
#license-cc-by-nc-4.0 #region-us
|
### Version 1
", "\n
|
[
"### Version 1\n\n\", \"\\n"
] |
[
"TAGS\n#license-cc-by-nc-4.0 #region-us \n",
"### Version 1\n\n\", \"\\n"
] |
[
17,
9
] |
[
"passage: TAGS\n#license-cc-by-nc-4.0 #region-us \n### Version 1\n\n\", \"\\n"
] |
96ba5f44e7417407a8b08796241852efeb8bf600
|
# Dataset Card for "AA_GPTNEO_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
EgilKarlsen/AA_GPTNEO_Finetuned
|
[
"region:us"
] |
2023-08-17T21:17:40+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "768", "dtype": "float32"}, {"name": "769", "dtype": "float32"}, {"name": "770", "dtype": "float32"}, {"name": "771", "dtype": "float32"}, {"name": "772", "dtype": "float32"}, {"name": "773", "dtype": "float32"}, {"name": "774", "dtype": "float32"}, {"name": "775", "dtype": "float32"}, {"name": "776", "dtype": "float32"}, {"name": "777", "dtype": "float32"}, {"name": "778", "dtype": "float32"}, {"name": "779", "dtype": "float32"}, {"name": "780", "dtype": "float32"}, {"name": "781", "dtype": "float32"}, {"name": "782", "dtype": "float32"}, {"name": "783", "dtype": "float32"}, {"name": "784", "dtype": "float32"}, {"name": "785", "dtype": "float32"}, {"name": "786", "dtype": "float32"}, {"name": "787", "dtype": "float32"}, {"name": "788", "dtype": "float32"}, {"name": "789", "dtype": "float32"}, {"name": "790", "dtype": "float32"}, {"name": "791", "dtype": "float32"}, {"name": "792", "dtype": "float32"}, {"name": "793", "dtype": "float32"}, {"name": "794", "dtype": "float32"}, {"name": "795", "dtype": "float32"}, {"name": "796", "dtype": "float32"}, {"name": "797", "dtype": "float32"}, {"name": "798", "dtype": "float32"}, {"name": "799", "dtype": "float32"}, {"name": "800", "dtype": "float32"}, {"name": "801", "dtype": "float32"}, {"name": "802", "dtype": "float32"}, {"name": "803", "dtype": "float32"}, {"name": "804", "dtype": "float32"}, {"name": "805", "dtype": "float32"}, {"name": "806", "dtype": "float32"}, {"name": "807", "dtype": "float32"}, {"name": "808", "dtype": "float32"}, {"name": "809", "dtype": "float32"}, {"name": "810", "dtype": "float32"}, {"name": "811", "dtype": "float32"}, {"name": "812", "dtype": "float32"}, {"name": "813", "dtype": "float32"}, {"name": "814", "dtype": "float32"}, {"name": "815", "dtype": "float32"}, {"name": "816", "dtype": "float32"}, {"name": "817", "dtype": "float32"}, {"name": "818", "dtype": "float32"}, {"name": "819", "dtype": "float32"}, {"name": "820", "dtype": "float32"}, {"name": "821", "dtype": "float32"}, {"name": "822", "dtype": "float32"}, {"name": "823", "dtype": "float32"}, {"name": "824", "dtype": "float32"}, {"name": "825", "dtype": "float32"}, {"name": "826", "dtype": "float32"}, {"name": "827", "dtype": "float32"}, {"name": "828", "dtype": "float32"}, {"name": "829", "dtype": "float32"}, {"name": "830", "dtype": "float32"}, {"name": "831", "dtype": "float32"}, {"name": "832", "dtype": "float32"}, {"name": "833", "dtype": "float32"}, {"name": "834", "dtype": "float32"}, {"name": "835", "dtype": "float32"}, {"name": "836", "dtype": "float32"}, {"name": "837", "dtype": "float32"}, {"name": "838", "dtype": "float32"}, {"name": "839", "dtype": "float32"}, {"name": "840", "dtype": "float32"}, {"name": "841", "dtype": "float32"}, {"name": "842", "dtype": "float32"}, {"name": "843", "dtype": "float32"}, {"name": "844", "dtype": "float32"}, {"name": "845", "dtype": "float32"}, {"name": "846", "dtype": "float32"}, {"name": "847", "dtype": "float32"}, {"name": "848", "dtype": "float32"}, {"name": "849", "dtype": "float32"}, {"name": "850", "dtype": "float32"}, {"name": "851", "dtype": "float32"}, {"name": "852", "dtype": "float32"}, {"name": "853", "dtype": "float32"}, {"name": "854", "dtype": "float32"}, {"name": "855", "dtype": "float32"}, {"name": "856", "dtype": "float32"}, {"name": "857", "dtype": "float32"}, {"name": "858", "dtype": "float32"}, {"name": "859", "dtype": "float32"}, {"name": "860", "dtype": "float32"}, {"name": "861", "dtype": "float32"}, {"name": 
"862", "dtype": "float32"}, {"name": "863", "dtype": "float32"}, {"name": "864", "dtype": "float32"}, {"name": "865", "dtype": "float32"}, {"name": "866", "dtype": "float32"}, {"name": "867", "dtype": "float32"}, {"name": "868", "dtype": "float32"}, {"name": "869", "dtype": "float32"}, {"name": "870", "dtype": "float32"}, {"name": "871", "dtype": "float32"}, {"name": "872", "dtype": "float32"}, {"name": "873", "dtype": "float32"}, {"name": "874", "dtype": "float32"}, {"name": "875", "dtype": "float32"}, {"name": "876", "dtype": "float32"}, {"name": "877", "dtype": "float32"}, {"name": "878", "dtype": "float32"}, {"name": "879", "dtype": "float32"}, {"name": "880", "dtype": "float32"}, {"name": "881", "dtype": "float32"}, {"name": "882", "dtype": "float32"}, {"name": "883", "dtype": "float32"}, {"name": "884", "dtype": "float32"}, {"name": "885", "dtype": "float32"}, {"name": "886", "dtype": "float32"}, {"name": "887", "dtype": "float32"}, {"name": "888", "dtype": "float32"}, {"name": "889", "dtype": "float32"}, {"name": "890", "dtype": "float32"}, {"name": "891", "dtype": "float32"}, {"name": "892", "dtype": "float32"}, {"name": "893", "dtype": "float32"}, {"name": "894", "dtype": "float32"}, {"name": "895", "dtype": "float32"}, {"name": "896", "dtype": "float32"}, {"name": "897", "dtype": "float32"}, {"name": "898", "dtype": "float32"}, {"name": "899", "dtype": "float32"}, {"name": "900", "dtype": "float32"}, {"name": "901", "dtype": "float32"}, {"name": "902", "dtype": "float32"}, {"name": "903", "dtype": "float32"}, {"name": "904", "dtype": "float32"}, {"name": "905", "dtype": "float32"}, {"name": "906", "dtype": "float32"}, {"name": "907", "dtype": "float32"}, {"name": "908", "dtype": "float32"}, {"name": "909", "dtype": "float32"}, {"name": "910", "dtype": "float32"}, {"name": "911", "dtype": "float32"}, {"name": "912", "dtype": "float32"}, {"name": "913", "dtype": "float32"}, {"name": "914", "dtype": "float32"}, {"name": "915", "dtype": "float32"}, {"name": "916", "dtype": "float32"}, {"name": "917", "dtype": "float32"}, {"name": "918", "dtype": "float32"}, {"name": "919", "dtype": "float32"}, {"name": "920", "dtype": "float32"}, {"name": "921", "dtype": "float32"}, {"name": "922", "dtype": "float32"}, {"name": "923", "dtype": "float32"}, {"name": "924", "dtype": "float32"}, {"name": "925", "dtype": "float32"}, {"name": "926", "dtype": "float32"}, {"name": "927", "dtype": "float32"}, {"name": "928", "dtype": "float32"}, {"name": "929", "dtype": "float32"}, {"name": "930", "dtype": "float32"}, {"name": "931", "dtype": "float32"}, {"name": "932", "dtype": "float32"}, {"name": "933", "dtype": "float32"}, {"name": "934", "dtype": "float32"}, {"name": "935", "dtype": "float32"}, {"name": "936", "dtype": "float32"}, {"name": "937", "dtype": "float32"}, {"name": "938", "dtype": "float32"}, {"name": "939", "dtype": "float32"}, {"name": "940", "dtype": "float32"}, {"name": "941", "dtype": "float32"}, {"name": "942", "dtype": "float32"}, {"name": "943", "dtype": "float32"}, {"name": "944", "dtype": "float32"}, {"name": "945", "dtype": "float32"}, {"name": "946", "dtype": "float32"}, {"name": "947", "dtype": "float32"}, {"name": "948", "dtype": "float32"}, {"name": "949", "dtype": "float32"}, {"name": "950", "dtype": "float32"}, {"name": "951", "dtype": "float32"}, {"name": "952", "dtype": "float32"}, {"name": "953", "dtype": "float32"}, {"name": "954", "dtype": "float32"}, {"name": "955", "dtype": "float32"}, {"name": "956", "dtype": "float32"}, {"name": "957", "dtype": "float32"}, {"name": 
"958", "dtype": "float32"}, {"name": "959", "dtype": "float32"}, {"name": "960", "dtype": "float32"}, {"name": "961", "dtype": "float32"}, {"name": "962", "dtype": "float32"}, {"name": "963", "dtype": "float32"}, {"name": "964", "dtype": "float32"}, {"name": "965", "dtype": "float32"}, {"name": "966", "dtype": "float32"}, {"name": "967", "dtype": "float32"}, {"name": "968", "dtype": "float32"}, {"name": "969", "dtype": "float32"}, {"name": "970", "dtype": "float32"}, {"name": "971", "dtype": "float32"}, {"name": "972", "dtype": "float32"}, {"name": "973", "dtype": "float32"}, {"name": "974", "dtype": "float32"}, {"name": "975", "dtype": "float32"}, {"name": "976", "dtype": "float32"}, {"name": "977", "dtype": "float32"}, {"name": "978", "dtype": "float32"}, {"name": "979", "dtype": "float32"}, {"name": "980", "dtype": "float32"}, {"name": "981", "dtype": "float32"}, {"name": "982", "dtype": "float32"}, {"name": "983", "dtype": "float32"}, {"name": "984", "dtype": "float32"}, {"name": "985", "dtype": "float32"}, {"name": "986", "dtype": "float32"}, {"name": "987", "dtype": "float32"}, {"name": "988", "dtype": "float32"}, {"name": "989", "dtype": "float32"}, {"name": "990", "dtype": "float32"}, {"name": "991", "dtype": "float32"}, {"name": "992", "dtype": "float32"}, {"name": "993", "dtype": "float32"}, {"name": "994", "dtype": "float32"}, {"name": "995", "dtype": "float32"}, {"name": "996", "dtype": "float32"}, {"name": "997", "dtype": "float32"}, {"name": "998", "dtype": "float32"}, {"name": "999", "dtype": "float32"}, {"name": "1000", "dtype": "float32"}, {"name": "1001", "dtype": "float32"}, {"name": "1002", "dtype": "float32"}, {"name": "1003", "dtype": "float32"}, {"name": "1004", "dtype": "float32"}, {"name": "1005", "dtype": "float32"}, {"name": "1006", "dtype": "float32"}, {"name": "1007", "dtype": "float32"}, {"name": "1008", "dtype": "float32"}, {"name": "1009", "dtype": "float32"}, {"name": "1010", "dtype": "float32"}, {"name": "1011", "dtype": "float32"}, {"name": "1012", "dtype": "float32"}, {"name": "1013", "dtype": "float32"}, {"name": "1014", "dtype": "float32"}, {"name": "1015", "dtype": "float32"}, {"name": "1016", "dtype": "float32"}, {"name": "1017", "dtype": "float32"}, {"name": "1018", "dtype": "float32"}, {"name": "1019", "dtype": "float32"}, {"name": "1020", "dtype": "float32"}, {"name": "1021", "dtype": "float32"}, {"name": "1022", "dtype": "float32"}, {"name": "1023", "dtype": "float32"}, {"name": "1024", "dtype": "float32"}, {"name": "1025", "dtype": "float32"}, {"name": "1026", "dtype": "float32"}, {"name": "1027", "dtype": "float32"}, {"name": "1028", "dtype": "float32"}, {"name": "1029", "dtype": "float32"}, {"name": "1030", "dtype": "float32"}, {"name": "1031", "dtype": "float32"}, {"name": "1032", "dtype": "float32"}, {"name": "1033", "dtype": "float32"}, {"name": "1034", "dtype": "float32"}, {"name": "1035", "dtype": "float32"}, {"name": "1036", "dtype": "float32"}, {"name": "1037", "dtype": "float32"}, {"name": "1038", "dtype": "float32"}, {"name": "1039", "dtype": "float32"}, {"name": "1040", "dtype": "float32"}, {"name": "1041", "dtype": "float32"}, {"name": "1042", "dtype": "float32"}, {"name": "1043", "dtype": "float32"}, {"name": "1044", "dtype": "float32"}, {"name": "1045", "dtype": "float32"}, {"name": "1046", "dtype": "float32"}, {"name": "1047", "dtype": "float32"}, {"name": "1048", "dtype": "float32"}, {"name": "1049", "dtype": "float32"}, {"name": "1050", "dtype": "float32"}, {"name": "1051", "dtype": "float32"}, {"name": "1052", "dtype": 
"float32"}, {"name": "1053", "dtype": "float32"}, {"name": "1054", "dtype": "float32"}, {"name": "1055", "dtype": "float32"}, {"name": "1056", "dtype": "float32"}, {"name": "1057", "dtype": "float32"}, {"name": "1058", "dtype": "float32"}, {"name": "1059", "dtype": "float32"}, {"name": "1060", "dtype": "float32"}, {"name": "1061", "dtype": "float32"}, {"name": "1062", "dtype": "float32"}, {"name": "1063", "dtype": "float32"}, {"name": "1064", "dtype": "float32"}, {"name": "1065", "dtype": "float32"}, {"name": "1066", "dtype": "float32"}, {"name": "1067", "dtype": "float32"}, {"name": "1068", "dtype": "float32"}, {"name": "1069", "dtype": "float32"}, {"name": "1070", "dtype": "float32"}, {"name": "1071", "dtype": "float32"}, {"name": "1072", "dtype": "float32"}, {"name": "1073", "dtype": "float32"}, {"name": "1074", "dtype": "float32"}, {"name": "1075", "dtype": "float32"}, {"name": "1076", "dtype": "float32"}, {"name": "1077", "dtype": "float32"}, {"name": "1078", "dtype": "float32"}, {"name": "1079", "dtype": "float32"}, {"name": "1080", "dtype": "float32"}, {"name": "1081", "dtype": "float32"}, {"name": "1082", "dtype": "float32"}, {"name": "1083", "dtype": "float32"}, {"name": "1084", "dtype": "float32"}, {"name": "1085", "dtype": "float32"}, {"name": "1086", "dtype": "float32"}, {"name": "1087", "dtype": "float32"}, {"name": "1088", "dtype": "float32"}, {"name": "1089", "dtype": "float32"}, {"name": "1090", "dtype": "float32"}, {"name": "1091", "dtype": "float32"}, {"name": "1092", "dtype": "float32"}, {"name": "1093", "dtype": "float32"}, {"name": "1094", "dtype": "float32"}, {"name": "1095", "dtype": "float32"}, {"name": "1096", "dtype": "float32"}, {"name": "1097", "dtype": "float32"}, {"name": "1098", "dtype": "float32"}, {"name": "1099", "dtype": "float32"}, {"name": "1100", "dtype": "float32"}, {"name": "1101", "dtype": "float32"}, {"name": "1102", "dtype": "float32"}, {"name": "1103", "dtype": "float32"}, {"name": "1104", "dtype": "float32"}, {"name": "1105", "dtype": "float32"}, {"name": "1106", "dtype": "float32"}, {"name": "1107", "dtype": "float32"}, {"name": "1108", "dtype": "float32"}, {"name": "1109", "dtype": "float32"}, {"name": "1110", "dtype": "float32"}, {"name": "1111", "dtype": "float32"}, {"name": "1112", "dtype": "float32"}, {"name": "1113", "dtype": "float32"}, {"name": "1114", "dtype": "float32"}, {"name": "1115", "dtype": "float32"}, {"name": "1116", "dtype": "float32"}, {"name": "1117", "dtype": "float32"}, {"name": "1118", "dtype": "float32"}, {"name": "1119", "dtype": "float32"}, {"name": "1120", "dtype": "float32"}, {"name": "1121", "dtype": "float32"}, {"name": "1122", "dtype": "float32"}, {"name": "1123", "dtype": "float32"}, {"name": "1124", "dtype": "float32"}, {"name": "1125", "dtype": "float32"}, {"name": "1126", "dtype": "float32"}, {"name": "1127", "dtype": "float32"}, {"name": "1128", "dtype": "float32"}, {"name": "1129", "dtype": "float32"}, {"name": "1130", "dtype": "float32"}, {"name": "1131", "dtype": "float32"}, {"name": "1132", "dtype": "float32"}, {"name": "1133", "dtype": "float32"}, {"name": "1134", "dtype": "float32"}, {"name": "1135", "dtype": "float32"}, {"name": "1136", "dtype": "float32"}, {"name": "1137", "dtype": "float32"}, {"name": "1138", "dtype": "float32"}, {"name": "1139", "dtype": "float32"}, {"name": "1140", "dtype": "float32"}, {"name": "1141", "dtype": "float32"}, {"name": "1142", "dtype": "float32"}, {"name": "1143", "dtype": "float32"}, {"name": "1144", "dtype": "float32"}, {"name": "1145", "dtype": "float32"}, {"name": 
"1146", "dtype": "float32"}, {"name": "1147", "dtype": "float32"}, {"name": "1148", "dtype": "float32"}, {"name": "1149", "dtype": "float32"}, {"name": "1150", "dtype": "float32"}, {"name": "1151", "dtype": "float32"}, {"name": "1152", "dtype": "float32"}, {"name": "1153", "dtype": "float32"}, {"name": "1154", "dtype": "float32"}, {"name": "1155", "dtype": "float32"}, {"name": "1156", "dtype": "float32"}, {"name": "1157", "dtype": "float32"}, {"name": "1158", "dtype": "float32"}, {"name": "1159", "dtype": "float32"}, {"name": "1160", "dtype": "float32"}, {"name": "1161", "dtype": "float32"}, {"name": "1162", "dtype": "float32"}, {"name": "1163", "dtype": "float32"}, {"name": "1164", "dtype": "float32"}, {"name": "1165", "dtype": "float32"}, {"name": "1166", "dtype": "float32"}, {"name": "1167", "dtype": "float32"}, {"name": "1168", "dtype": "float32"}, {"name": "1169", "dtype": "float32"}, {"name": "1170", "dtype": "float32"}, {"name": "1171", "dtype": "float32"}, {"name": "1172", "dtype": "float32"}, {"name": "1173", "dtype": "float32"}, {"name": "1174", "dtype": "float32"}, {"name": "1175", "dtype": "float32"}, {"name": "1176", "dtype": "float32"}, {"name": "1177", "dtype": "float32"}, {"name": "1178", "dtype": "float32"}, {"name": "1179", "dtype": "float32"}, {"name": "1180", "dtype": "float32"}, {"name": "1181", "dtype": "float32"}, {"name": "1182", "dtype": "float32"}, {"name": "1183", "dtype": "float32"}, {"name": "1184", "dtype": "float32"}, {"name": "1185", "dtype": "float32"}, {"name": "1186", "dtype": "float32"}, {"name": "1187", "dtype": "float32"}, {"name": "1188", "dtype": "float32"}, {"name": "1189", "dtype": "float32"}, {"name": "1190", "dtype": "float32"}, {"name": "1191", "dtype": "float32"}, {"name": "1192", "dtype": "float32"}, {"name": "1193", "dtype": "float32"}, {"name": "1194", "dtype": "float32"}, {"name": "1195", "dtype": "float32"}, {"name": "1196", "dtype": "float32"}, {"name": "1197", "dtype": "float32"}, {"name": "1198", "dtype": "float32"}, {"name": "1199", "dtype": "float32"}, {"name": "1200", "dtype": "float32"}, {"name": "1201", "dtype": "float32"}, {"name": "1202", "dtype": "float32"}, {"name": "1203", "dtype": "float32"}, {"name": "1204", "dtype": "float32"}, {"name": "1205", "dtype": "float32"}, {"name": "1206", "dtype": "float32"}, {"name": "1207", "dtype": "float32"}, {"name": "1208", "dtype": "float32"}, {"name": "1209", "dtype": "float32"}, {"name": "1210", "dtype": "float32"}, {"name": "1211", "dtype": "float32"}, {"name": "1212", "dtype": "float32"}, {"name": "1213", "dtype": "float32"}, {"name": "1214", "dtype": "float32"}, {"name": "1215", "dtype": "float32"}, {"name": "1216", "dtype": "float32"}, {"name": "1217", "dtype": "float32"}, {"name": "1218", "dtype": "float32"}, {"name": "1219", "dtype": "float32"}, {"name": "1220", "dtype": "float32"}, {"name": "1221", "dtype": "float32"}, {"name": "1222", "dtype": "float32"}, {"name": "1223", "dtype": "float32"}, {"name": "1224", "dtype": "float32"}, {"name": "1225", "dtype": "float32"}, {"name": "1226", "dtype": "float32"}, {"name": "1227", "dtype": "float32"}, {"name": "1228", "dtype": "float32"}, {"name": "1229", "dtype": "float32"}, {"name": "1230", "dtype": "float32"}, {"name": "1231", "dtype": "float32"}, {"name": "1232", "dtype": "float32"}, {"name": "1233", "dtype": "float32"}, {"name": "1234", "dtype": "float32"}, {"name": "1235", "dtype": "float32"}, {"name": "1236", "dtype": "float32"}, {"name": "1237", "dtype": "float32"}, {"name": "1238", "dtype": "float32"}, {"name": "1239", "dtype": 
"float32"}, {"name": "1240", "dtype": "float32"}, {"name": "1241", "dtype": "float32"}, {"name": "1242", "dtype": "float32"}, {"name": "1243", "dtype": "float32"}, {"name": "1244", "dtype": "float32"}, {"name": "1245", "dtype": "float32"}, {"name": "1246", "dtype": "float32"}, {"name": "1247", "dtype": "float32"}, {"name": "1248", "dtype": "float32"}, {"name": "1249", "dtype": "float32"}, {"name": "1250", "dtype": "float32"}, {"name": "1251", "dtype": "float32"}, {"name": "1252", "dtype": "float32"}, {"name": "1253", "dtype": "float32"}, {"name": "1254", "dtype": "float32"}, {"name": "1255", "dtype": "float32"}, {"name": "1256", "dtype": "float32"}, {"name": "1257", "dtype": "float32"}, {"name": "1258", "dtype": "float32"}, {"name": "1259", "dtype": "float32"}, {"name": "1260", "dtype": "float32"}, {"name": "1261", "dtype": "float32"}, {"name": "1262", "dtype": "float32"}, {"name": "1263", "dtype": "float32"}, {"name": "1264", "dtype": "float32"}, {"name": "1265", "dtype": "float32"}, {"name": "1266", "dtype": "float32"}, {"name": "1267", "dtype": "float32"}, {"name": "1268", "dtype": "float32"}, {"name": "1269", "dtype": "float32"}, {"name": "1270", "dtype": "float32"}, {"name": "1271", "dtype": "float32"}, {"name": "1272", "dtype": "float32"}, {"name": "1273", "dtype": "float32"}, {"name": "1274", "dtype": "float32"}, {"name": "1275", "dtype": "float32"}, {"name": "1276", "dtype": "float32"}, {"name": "1277", "dtype": "float32"}, {"name": "1278", "dtype": "float32"}, {"name": "1279", "dtype": "float32"}, {"name": "1280", "dtype": "float32"}, {"name": "1281", "dtype": "float32"}, {"name": "1282", "dtype": "float32"}, {"name": "1283", "dtype": "float32"}, {"name": "1284", "dtype": "float32"}, {"name": "1285", "dtype": "float32"}, {"name": "1286", "dtype": "float32"}, {"name": "1287", "dtype": "float32"}, {"name": "1288", "dtype": "float32"}, {"name": "1289", "dtype": "float32"}, {"name": "1290", "dtype": "float32"}, {"name": "1291", "dtype": "float32"}, {"name": "1292", "dtype": "float32"}, {"name": "1293", "dtype": "float32"}, {"name": "1294", "dtype": "float32"}, {"name": "1295", "dtype": "float32"}, {"name": "1296", "dtype": "float32"}, {"name": "1297", "dtype": "float32"}, {"name": "1298", "dtype": "float32"}, {"name": "1299", "dtype": "float32"}, {"name": "1300", "dtype": "float32"}, {"name": "1301", "dtype": "float32"}, {"name": "1302", "dtype": "float32"}, {"name": "1303", "dtype": "float32"}, {"name": "1304", "dtype": "float32"}, {"name": "1305", "dtype": "float32"}, {"name": "1306", "dtype": "float32"}, {"name": "1307", "dtype": "float32"}, {"name": "1308", "dtype": "float32"}, {"name": "1309", "dtype": "float32"}, {"name": "1310", "dtype": "float32"}, {"name": "1311", "dtype": "float32"}, {"name": "1312", "dtype": "float32"}, {"name": "1313", "dtype": "float32"}, {"name": "1314", "dtype": "float32"}, {"name": "1315", "dtype": "float32"}, {"name": "1316", "dtype": "float32"}, {"name": "1317", "dtype": "float32"}, {"name": "1318", "dtype": "float32"}, {"name": "1319", "dtype": "float32"}, {"name": "1320", "dtype": "float32"}, {"name": "1321", "dtype": "float32"}, {"name": "1322", "dtype": "float32"}, {"name": "1323", "dtype": "float32"}, {"name": "1324", "dtype": "float32"}, {"name": "1325", "dtype": "float32"}, {"name": "1326", "dtype": "float32"}, {"name": "1327", "dtype": "float32"}, {"name": "1328", "dtype": "float32"}, {"name": "1329", "dtype": "float32"}, {"name": "1330", "dtype": "float32"}, {"name": "1331", "dtype": "float32"}, {"name": "1332", "dtype": "float32"}, {"name": 
"1333", "dtype": "float32"}, {"name": "1334", "dtype": "float32"}, {"name": "1335", "dtype": "float32"}, {"name": "1336", "dtype": "float32"}, {"name": "1337", "dtype": "float32"}, {"name": "1338", "dtype": "float32"}, {"name": "1339", "dtype": "float32"}, {"name": "1340", "dtype": "float32"}, {"name": "1341", "dtype": "float32"}, {"name": "1342", "dtype": "float32"}, {"name": "1343", "dtype": "float32"}, {"name": "1344", "dtype": "float32"}, {"name": "1345", "dtype": "float32"}, {"name": "1346", "dtype": "float32"}, {"name": "1347", "dtype": "float32"}, {"name": "1348", "dtype": "float32"}, {"name": "1349", "dtype": "float32"}, {"name": "1350", "dtype": "float32"}, {"name": "1351", "dtype": "float32"}, {"name": "1352", "dtype": "float32"}, {"name": "1353", "dtype": "float32"}, {"name": "1354", "dtype": "float32"}, {"name": "1355", "dtype": "float32"}, {"name": "1356", "dtype": "float32"}, {"name": "1357", "dtype": "float32"}, {"name": "1358", "dtype": "float32"}, {"name": "1359", "dtype": "float32"}, {"name": "1360", "dtype": "float32"}, {"name": "1361", "dtype": "float32"}, {"name": "1362", "dtype": "float32"}, {"name": "1363", "dtype": "float32"}, {"name": "1364", "dtype": "float32"}, {"name": "1365", "dtype": "float32"}, {"name": "1366", "dtype": "float32"}, {"name": "1367", "dtype": "float32"}, {"name": "1368", "dtype": "float32"}, {"name": "1369", "dtype": "float32"}, {"name": "1370", "dtype": "float32"}, {"name": "1371", "dtype": "float32"}, {"name": "1372", "dtype": "float32"}, {"name": "1373", "dtype": "float32"}, {"name": "1374", "dtype": "float32"}, {"name": "1375", "dtype": "float32"}, {"name": "1376", "dtype": "float32"}, {"name": "1377", "dtype": "float32"}, {"name": "1378", "dtype": "float32"}, {"name": "1379", "dtype": "float32"}, {"name": "1380", "dtype": "float32"}, {"name": "1381", "dtype": "float32"}, {"name": "1382", "dtype": "float32"}, {"name": "1383", "dtype": "float32"}, {"name": "1384", "dtype": "float32"}, {"name": "1385", "dtype": "float32"}, {"name": "1386", "dtype": "float32"}, {"name": "1387", "dtype": "float32"}, {"name": "1388", "dtype": "float32"}, {"name": "1389", "dtype": "float32"}, {"name": "1390", "dtype": "float32"}, {"name": "1391", "dtype": "float32"}, {"name": "1392", "dtype": "float32"}, {"name": "1393", "dtype": "float32"}, {"name": "1394", "dtype": "float32"}, {"name": "1395", "dtype": "float32"}, {"name": "1396", "dtype": "float32"}, {"name": "1397", "dtype": "float32"}, {"name": "1398", "dtype": "float32"}, {"name": "1399", "dtype": "float32"}, {"name": "1400", "dtype": "float32"}, {"name": "1401", "dtype": "float32"}, {"name": "1402", "dtype": "float32"}, {"name": "1403", "dtype": "float32"}, {"name": "1404", "dtype": "float32"}, {"name": "1405", "dtype": "float32"}, {"name": "1406", "dtype": "float32"}, {"name": "1407", "dtype": "float32"}, {"name": "1408", "dtype": "float32"}, {"name": "1409", "dtype": "float32"}, {"name": "1410", "dtype": "float32"}, {"name": "1411", "dtype": "float32"}, {"name": "1412", "dtype": "float32"}, {"name": "1413", "dtype": "float32"}, {"name": "1414", "dtype": "float32"}, {"name": "1415", "dtype": "float32"}, {"name": "1416", "dtype": "float32"}, {"name": "1417", "dtype": "float32"}, {"name": "1418", "dtype": "float32"}, {"name": "1419", "dtype": "float32"}, {"name": "1420", "dtype": "float32"}, {"name": "1421", "dtype": "float32"}, {"name": "1422", "dtype": "float32"}, {"name": "1423", "dtype": "float32"}, {"name": "1424", "dtype": "float32"}, {"name": "1425", "dtype": "float32"}, {"name": "1426", "dtype": 
"float32"}, {"name": "1427", "dtype": "float32"}, {"name": "1428", "dtype": "float32"}, {"name": "1429", "dtype": "float32"}, {"name": "1430", "dtype": "float32"}, {"name": "1431", "dtype": "float32"}, {"name": "1432", "dtype": "float32"}, {"name": "1433", "dtype": "float32"}, {"name": "1434", "dtype": "float32"}, {"name": "1435", "dtype": "float32"}, {"name": "1436", "dtype": "float32"}, {"name": "1437", "dtype": "float32"}, {"name": "1438", "dtype": "float32"}, {"name": "1439", "dtype": "float32"}, {"name": "1440", "dtype": "float32"}, {"name": "1441", "dtype": "float32"}, {"name": "1442", "dtype": "float32"}, {"name": "1443", "dtype": "float32"}, {"name": "1444", "dtype": "float32"}, {"name": "1445", "dtype": "float32"}, {"name": "1446", "dtype": "float32"}, {"name": "1447", "dtype": "float32"}, {"name": "1448", "dtype": "float32"}, {"name": "1449", "dtype": "float32"}, {"name": "1450", "dtype": "float32"}, {"name": "1451", "dtype": "float32"}, {"name": "1452", "dtype": "float32"}, {"name": "1453", "dtype": "float32"}, {"name": "1454", "dtype": "float32"}, {"name": "1455", "dtype": "float32"}, {"name": "1456", "dtype": "float32"}, {"name": "1457", "dtype": "float32"}, {"name": "1458", "dtype": "float32"}, {"name": "1459", "dtype": "float32"}, {"name": "1460", "dtype": "float32"}, {"name": "1461", "dtype": "float32"}, {"name": "1462", "dtype": "float32"}, {"name": "1463", "dtype": "float32"}, {"name": "1464", "dtype": "float32"}, {"name": "1465", "dtype": "float32"}, {"name": "1466", "dtype": "float32"}, {"name": "1467", "dtype": "float32"}, {"name": "1468", "dtype": "float32"}, {"name": "1469", "dtype": "float32"}, {"name": "1470", "dtype": "float32"}, {"name": "1471", "dtype": "float32"}, {"name": "1472", "dtype": "float32"}, {"name": "1473", "dtype": "float32"}, {"name": "1474", "dtype": "float32"}, {"name": "1475", "dtype": "float32"}, {"name": "1476", "dtype": "float32"}, {"name": "1477", "dtype": "float32"}, {"name": "1478", "dtype": "float32"}, {"name": "1479", "dtype": "float32"}, {"name": "1480", "dtype": "float32"}, {"name": "1481", "dtype": "float32"}, {"name": "1482", "dtype": "float32"}, {"name": "1483", "dtype": "float32"}, {"name": "1484", "dtype": "float32"}, {"name": "1485", "dtype": "float32"}, {"name": "1486", "dtype": "float32"}, {"name": "1487", "dtype": "float32"}, {"name": "1488", "dtype": "float32"}, {"name": "1489", "dtype": "float32"}, {"name": "1490", "dtype": "float32"}, {"name": "1491", "dtype": "float32"}, {"name": "1492", "dtype": "float32"}, {"name": "1493", "dtype": "float32"}, {"name": "1494", "dtype": "float32"}, {"name": "1495", "dtype": "float32"}, {"name": "1496", "dtype": "float32"}, {"name": "1497", "dtype": "float32"}, {"name": "1498", "dtype": "float32"}, {"name": "1499", "dtype": "float32"}, {"name": "1500", "dtype": "float32"}, {"name": "1501", "dtype": "float32"}, {"name": "1502", "dtype": "float32"}, {"name": "1503", "dtype": "float32"}, {"name": "1504", "dtype": "float32"}, {"name": "1505", "dtype": "float32"}, {"name": "1506", "dtype": "float32"}, {"name": "1507", "dtype": "float32"}, {"name": "1508", "dtype": "float32"}, {"name": "1509", "dtype": "float32"}, {"name": "1510", "dtype": "float32"}, {"name": "1511", "dtype": "float32"}, {"name": "1512", "dtype": "float32"}, {"name": "1513", "dtype": "float32"}, {"name": "1514", "dtype": "float32"}, {"name": "1515", "dtype": "float32"}, {"name": "1516", "dtype": "float32"}, {"name": "1517", "dtype": "float32"}, {"name": "1518", "dtype": "float32"}, {"name": "1519", "dtype": "float32"}, {"name": 
"1520", "dtype": "float32"}, {"name": "1521", "dtype": "float32"}, {"name": "1522", "dtype": "float32"}, {"name": "1523", "dtype": "float32"}, {"name": "1524", "dtype": "float32"}, {"name": "1525", "dtype": "float32"}, {"name": "1526", "dtype": "float32"}, {"name": "1527", "dtype": "float32"}, {"name": "1528", "dtype": "float32"}, {"name": "1529", "dtype": "float32"}, {"name": "1530", "dtype": "float32"}, {"name": "1531", "dtype": "float32"}, {"name": "1532", "dtype": "float32"}, {"name": "1533", "dtype": "float32"}, {"name": "1534", "dtype": "float32"}, {"name": "1535", "dtype": "float32"}, {"name": "1536", "dtype": "float32"}, {"name": "1537", "dtype": "float32"}, {"name": "1538", "dtype": "float32"}, {"name": "1539", "dtype": "float32"}, {"name": "1540", "dtype": "float32"}, {"name": "1541", "dtype": "float32"}, {"name": "1542", "dtype": "float32"}, {"name": "1543", "dtype": "float32"}, {"name": "1544", "dtype": "float32"}, {"name": "1545", "dtype": "float32"}, {"name": "1546", "dtype": "float32"}, {"name": "1547", "dtype": "float32"}, {"name": "1548", "dtype": "float32"}, {"name": "1549", "dtype": "float32"}, {"name": "1550", "dtype": "float32"}, {"name": "1551", "dtype": "float32"}, {"name": "1552", "dtype": "float32"}, {"name": "1553", "dtype": "float32"}, {"name": "1554", "dtype": "float32"}, {"name": "1555", "dtype": "float32"}, {"name": "1556", "dtype": "float32"}, {"name": "1557", "dtype": "float32"}, {"name": "1558", "dtype": "float32"}, {"name": "1559", "dtype": "float32"}, {"name": "1560", "dtype": "float32"}, {"name": "1561", "dtype": "float32"}, {"name": "1562", "dtype": "float32"}, {"name": "1563", "dtype": "float32"}, {"name": "1564", "dtype": "float32"}, {"name": "1565", "dtype": "float32"}, {"name": "1566", "dtype": "float32"}, {"name": "1567", "dtype": "float32"}, {"name": "1568", "dtype": "float32"}, {"name": "1569", "dtype": "float32"}, {"name": "1570", "dtype": "float32"}, {"name": "1571", "dtype": "float32"}, {"name": "1572", "dtype": "float32"}, {"name": "1573", "dtype": "float32"}, {"name": "1574", "dtype": "float32"}, {"name": "1575", "dtype": "float32"}, {"name": "1576", "dtype": "float32"}, {"name": "1577", "dtype": "float32"}, {"name": "1578", "dtype": "float32"}, {"name": "1579", "dtype": "float32"}, {"name": "1580", "dtype": "float32"}, {"name": "1581", "dtype": "float32"}, {"name": "1582", "dtype": "float32"}, {"name": "1583", "dtype": "float32"}, {"name": "1584", "dtype": "float32"}, {"name": "1585", "dtype": "float32"}, {"name": "1586", "dtype": "float32"}, {"name": "1587", "dtype": "float32"}, {"name": "1588", "dtype": "float32"}, {"name": "1589", "dtype": "float32"}, {"name": "1590", "dtype": "float32"}, {"name": "1591", "dtype": "float32"}, {"name": "1592", "dtype": "float32"}, {"name": "1593", "dtype": "float32"}, {"name": "1594", "dtype": "float32"}, {"name": "1595", "dtype": "float32"}, {"name": "1596", "dtype": "float32"}, {"name": "1597", "dtype": "float32"}, {"name": "1598", "dtype": "float32"}, {"name": "1599", "dtype": "float32"}, {"name": "1600", "dtype": "float32"}, {"name": "1601", "dtype": "float32"}, {"name": "1602", "dtype": "float32"}, {"name": "1603", "dtype": "float32"}, {"name": "1604", "dtype": "float32"}, {"name": "1605", "dtype": "float32"}, {"name": "1606", "dtype": "float32"}, {"name": "1607", "dtype": "float32"}, {"name": "1608", "dtype": "float32"}, {"name": "1609", "dtype": "float32"}, {"name": "1610", "dtype": "float32"}, {"name": "1611", "dtype": "float32"}, {"name": "1612", "dtype": "float32"}, {"name": "1613", "dtype": 
"float32"}, {"name": "1614", "dtype": "float32"}, {"name": "1615", "dtype": "float32"}, {"name": "1616", "dtype": "float32"}, {"name": "1617", "dtype": "float32"}, {"name": "1618", "dtype": "float32"}, {"name": "1619", "dtype": "float32"}, {"name": "1620", "dtype": "float32"}, {"name": "1621", "dtype": "float32"}, {"name": "1622", "dtype": "float32"}, {"name": "1623", "dtype": "float32"}, {"name": "1624", "dtype": "float32"}, {"name": "1625", "dtype": "float32"}, {"name": "1626", "dtype": "float32"}, {"name": "1627", "dtype": "float32"}, {"name": "1628", "dtype": "float32"}, {"name": "1629", "dtype": "float32"}, {"name": "1630", "dtype": "float32"}, {"name": "1631", "dtype": "float32"}, {"name": "1632", "dtype": "float32"}, {"name": "1633", "dtype": "float32"}, {"name": "1634", "dtype": "float32"}, {"name": "1635", "dtype": "float32"}, {"name": "1636", "dtype": "float32"}, {"name": "1637", "dtype": "float32"}, {"name": "1638", "dtype": "float32"}, {"name": "1639", "dtype": "float32"}, {"name": "1640", "dtype": "float32"}, {"name": "1641", "dtype": "float32"}, {"name": "1642", "dtype": "float32"}, {"name": "1643", "dtype": "float32"}, {"name": "1644", "dtype": "float32"}, {"name": "1645", "dtype": "float32"}, {"name": "1646", "dtype": "float32"}, {"name": "1647", "dtype": "float32"}, {"name": "1648", "dtype": "float32"}, {"name": "1649", "dtype": "float32"}, {"name": "1650", "dtype": "float32"}, {"name": "1651", "dtype": "float32"}, {"name": "1652", "dtype": "float32"}, {"name": "1653", "dtype": "float32"}, {"name": "1654", "dtype": "float32"}, {"name": "1655", "dtype": "float32"}, {"name": "1656", "dtype": "float32"}, {"name": "1657", "dtype": "float32"}, {"name": "1658", "dtype": "float32"}, {"name": "1659", "dtype": "float32"}, {"name": "1660", "dtype": "float32"}, {"name": "1661", "dtype": "float32"}, {"name": "1662", "dtype": "float32"}, {"name": "1663", "dtype": "float32"}, {"name": "1664", "dtype": "float32"}, {"name": "1665", "dtype": "float32"}, {"name": "1666", "dtype": "float32"}, {"name": "1667", "dtype": "float32"}, {"name": "1668", "dtype": "float32"}, {"name": "1669", "dtype": "float32"}, {"name": "1670", "dtype": "float32"}, {"name": "1671", "dtype": "float32"}, {"name": "1672", "dtype": "float32"}, {"name": "1673", "dtype": "float32"}, {"name": "1674", "dtype": "float32"}, {"name": "1675", "dtype": "float32"}, {"name": "1676", "dtype": "float32"}, {"name": "1677", "dtype": "float32"}, {"name": "1678", "dtype": "float32"}, {"name": "1679", "dtype": "float32"}, {"name": "1680", "dtype": "float32"}, {"name": "1681", "dtype": "float32"}, {"name": "1682", "dtype": "float32"}, {"name": "1683", "dtype": "float32"}, {"name": "1684", "dtype": "float32"}, {"name": "1685", "dtype": "float32"}, {"name": "1686", "dtype": "float32"}, {"name": "1687", "dtype": "float32"}, {"name": "1688", "dtype": "float32"}, {"name": "1689", "dtype": "float32"}, {"name": "1690", "dtype": "float32"}, {"name": "1691", "dtype": "float32"}, {"name": "1692", "dtype": "float32"}, {"name": "1693", "dtype": "float32"}, {"name": "1694", "dtype": "float32"}, {"name": "1695", "dtype": "float32"}, {"name": "1696", "dtype": "float32"}, {"name": "1697", "dtype": "float32"}, {"name": "1698", "dtype": "float32"}, {"name": "1699", "dtype": "float32"}, {"name": "1700", "dtype": "float32"}, {"name": "1701", "dtype": "float32"}, {"name": "1702", "dtype": "float32"}, {"name": "1703", "dtype": "float32"}, {"name": "1704", "dtype": "float32"}, {"name": "1705", "dtype": "float32"}, {"name": "1706", "dtype": "float32"}, {"name": 
"1707", "dtype": "float32"}, {"name": "1708", "dtype": "float32"}, {"name": "1709", "dtype": "float32"}, {"name": "1710", "dtype": "float32"}, {"name": "1711", "dtype": "float32"}, {"name": "1712", "dtype": "float32"}, {"name": "1713", "dtype": "float32"}, {"name": "1714", "dtype": "float32"}, {"name": "1715", "dtype": "float32"}, {"name": "1716", "dtype": "float32"}, {"name": "1717", "dtype": "float32"}, {"name": "1718", "dtype": "float32"}, {"name": "1719", "dtype": "float32"}, {"name": "1720", "dtype": "float32"}, {"name": "1721", "dtype": "float32"}, {"name": "1722", "dtype": "float32"}, {"name": "1723", "dtype": "float32"}, {"name": "1724", "dtype": "float32"}, {"name": "1725", "dtype": "float32"}, {"name": "1726", "dtype": "float32"}, {"name": "1727", "dtype": "float32"}, {"name": "1728", "dtype": "float32"}, {"name": "1729", "dtype": "float32"}, {"name": "1730", "dtype": "float32"}, {"name": "1731", "dtype": "float32"}, {"name": "1732", "dtype": "float32"}, {"name": "1733", "dtype": "float32"}, {"name": "1734", "dtype": "float32"}, {"name": "1735", "dtype": "float32"}, {"name": "1736", "dtype": "float32"}, {"name": "1737", "dtype": "float32"}, {"name": "1738", "dtype": "float32"}, {"name": "1739", "dtype": "float32"}, {"name": "1740", "dtype": "float32"}, {"name": "1741", "dtype": "float32"}, {"name": "1742", "dtype": "float32"}, {"name": "1743", "dtype": "float32"}, {"name": "1744", "dtype": "float32"}, {"name": "1745", "dtype": "float32"}, {"name": "1746", "dtype": "float32"}, {"name": "1747", "dtype": "float32"}, {"name": "1748", "dtype": "float32"}, {"name": "1749", "dtype": "float32"}, {"name": "1750", "dtype": "float32"}, {"name": "1751", "dtype": "float32"}, {"name": "1752", "dtype": "float32"}, {"name": "1753", "dtype": "float32"}, {"name": "1754", "dtype": "float32"}, {"name": "1755", "dtype": "float32"}, {"name": "1756", "dtype": "float32"}, {"name": "1757", "dtype": "float32"}, {"name": "1758", "dtype": "float32"}, {"name": "1759", "dtype": "float32"}, {"name": "1760", "dtype": "float32"}, {"name": "1761", "dtype": "float32"}, {"name": "1762", "dtype": "float32"}, {"name": "1763", "dtype": "float32"}, {"name": "1764", "dtype": "float32"}, {"name": "1765", "dtype": "float32"}, {"name": "1766", "dtype": "float32"}, {"name": "1767", "dtype": "float32"}, {"name": "1768", "dtype": "float32"}, {"name": "1769", "dtype": "float32"}, {"name": "1770", "dtype": "float32"}, {"name": "1771", "dtype": "float32"}, {"name": "1772", "dtype": "float32"}, {"name": "1773", "dtype": "float32"}, {"name": "1774", "dtype": "float32"}, {"name": "1775", "dtype": "float32"}, {"name": "1776", "dtype": "float32"}, {"name": "1777", "dtype": "float32"}, {"name": "1778", "dtype": "float32"}, {"name": "1779", "dtype": "float32"}, {"name": "1780", "dtype": "float32"}, {"name": "1781", "dtype": "float32"}, {"name": "1782", "dtype": "float32"}, {"name": "1783", "dtype": "float32"}, {"name": "1784", "dtype": "float32"}, {"name": "1785", "dtype": "float32"}, {"name": "1786", "dtype": "float32"}, {"name": "1787", "dtype": "float32"}, {"name": "1788", "dtype": "float32"}, {"name": "1789", "dtype": "float32"}, {"name": "1790", "dtype": "float32"}, {"name": "1791", "dtype": "float32"}, {"name": "1792", "dtype": "float32"}, {"name": "1793", "dtype": "float32"}, {"name": "1794", "dtype": "float32"}, {"name": "1795", "dtype": "float32"}, {"name": "1796", "dtype": "float32"}, {"name": "1797", "dtype": "float32"}, {"name": "1798", "dtype": "float32"}, {"name": "1799", "dtype": "float32"}, {"name": "1800", "dtype": 
"float32"}, {"name": "1801", "dtype": "float32"}, {"name": "1802", "dtype": "float32"}, {"name": "1803", "dtype": "float32"}, {"name": "1804", "dtype": "float32"}, {"name": "1805", "dtype": "float32"}, {"name": "1806", "dtype": "float32"}, {"name": "1807", "dtype": "float32"}, {"name": "1808", "dtype": "float32"}, {"name": "1809", "dtype": "float32"}, {"name": "1810", "dtype": "float32"}, {"name": "1811", "dtype": "float32"}, {"name": "1812", "dtype": "float32"}, {"name": "1813", "dtype": "float32"}, {"name": "1814", "dtype": "float32"}, {"name": "1815", "dtype": "float32"}, {"name": "1816", "dtype": "float32"}, {"name": "1817", "dtype": "float32"}, {"name": "1818", "dtype": "float32"}, {"name": "1819", "dtype": "float32"}, {"name": "1820", "dtype": "float32"}, {"name": "1821", "dtype": "float32"}, {"name": "1822", "dtype": "float32"}, {"name": "1823", "dtype": "float32"}, {"name": "1824", "dtype": "float32"}, {"name": "1825", "dtype": "float32"}, {"name": "1826", "dtype": "float32"}, {"name": "1827", "dtype": "float32"}, {"name": "1828", "dtype": "float32"}, {"name": "1829", "dtype": "float32"}, {"name": "1830", "dtype": "float32"}, {"name": "1831", "dtype": "float32"}, {"name": "1832", "dtype": "float32"}, {"name": "1833", "dtype": "float32"}, {"name": "1834", "dtype": "float32"}, {"name": "1835", "dtype": "float32"}, {"name": "1836", "dtype": "float32"}, {"name": "1837", "dtype": "float32"}, {"name": "1838", "dtype": "float32"}, {"name": "1839", "dtype": "float32"}, {"name": "1840", "dtype": "float32"}, {"name": "1841", "dtype": "float32"}, {"name": "1842", "dtype": "float32"}, {"name": "1843", "dtype": "float32"}, {"name": "1844", "dtype": "float32"}, {"name": "1845", "dtype": "float32"}, {"name": "1846", "dtype": "float32"}, {"name": "1847", "dtype": "float32"}, {"name": "1848", "dtype": "float32"}, {"name": "1849", "dtype": "float32"}, {"name": "1850", "dtype": "float32"}, {"name": "1851", "dtype": "float32"}, {"name": "1852", "dtype": "float32"}, {"name": "1853", "dtype": "float32"}, {"name": "1854", "dtype": "float32"}, {"name": "1855", "dtype": "float32"}, {"name": "1856", "dtype": "float32"}, {"name": "1857", "dtype": "float32"}, {"name": "1858", "dtype": "float32"}, {"name": "1859", "dtype": "float32"}, {"name": "1860", "dtype": "float32"}, {"name": "1861", "dtype": "float32"}, {"name": "1862", "dtype": "float32"}, {"name": "1863", "dtype": "float32"}, {"name": "1864", "dtype": "float32"}, {"name": "1865", "dtype": "float32"}, {"name": "1866", "dtype": "float32"}, {"name": "1867", "dtype": "float32"}, {"name": "1868", "dtype": "float32"}, {"name": "1869", "dtype": "float32"}, {"name": "1870", "dtype": "float32"}, {"name": "1871", "dtype": "float32"}, {"name": "1872", "dtype": "float32"}, {"name": "1873", "dtype": "float32"}, {"name": "1874", "dtype": "float32"}, {"name": "1875", "dtype": "float32"}, {"name": "1876", "dtype": "float32"}, {"name": "1877", "dtype": "float32"}, {"name": "1878", "dtype": "float32"}, {"name": "1879", "dtype": "float32"}, {"name": "1880", "dtype": "float32"}, {"name": "1881", "dtype": "float32"}, {"name": "1882", "dtype": "float32"}, {"name": "1883", "dtype": "float32"}, {"name": "1884", "dtype": "float32"}, {"name": "1885", "dtype": "float32"}, {"name": "1886", "dtype": "float32"}, {"name": "1887", "dtype": "float32"}, {"name": "1888", "dtype": "float32"}, {"name": "1889", "dtype": "float32"}, {"name": "1890", "dtype": "float32"}, {"name": "1891", "dtype": "float32"}, {"name": "1892", "dtype": "float32"}, {"name": "1893", "dtype": "float32"}, {"name": 
"1894", "dtype": "float32"}, {"name": "1895", "dtype": "float32"}, {"name": "1896", "dtype": "float32"}, {"name": "1897", "dtype": "float32"}, {"name": "1898", "dtype": "float32"}, {"name": "1899", "dtype": "float32"}, {"name": "1900", "dtype": "float32"}, {"name": "1901", "dtype": "float32"}, {"name": "1902", "dtype": "float32"}, {"name": "1903", "dtype": "float32"}, {"name": "1904", "dtype": "float32"}, {"name": "1905", "dtype": "float32"}, {"name": "1906", "dtype": "float32"}, {"name": "1907", "dtype": "float32"}, {"name": "1908", "dtype": "float32"}, {"name": "1909", "dtype": "float32"}, {"name": "1910", "dtype": "float32"}, {"name": "1911", "dtype": "float32"}, {"name": "1912", "dtype": "float32"}, {"name": "1913", "dtype": "float32"}, {"name": "1914", "dtype": "float32"}, {"name": "1915", "dtype": "float32"}, {"name": "1916", "dtype": "float32"}, {"name": "1917", "dtype": "float32"}, {"name": "1918", "dtype": "float32"}, {"name": "1919", "dtype": "float32"}, {"name": "1920", "dtype": "float32"}, {"name": "1921", "dtype": "float32"}, {"name": "1922", "dtype": "float32"}, {"name": "1923", "dtype": "float32"}, {"name": "1924", "dtype": "float32"}, {"name": "1925", "dtype": "float32"}, {"name": "1926", "dtype": "float32"}, {"name": "1927", "dtype": "float32"}, {"name": "1928", "dtype": "float32"}, {"name": "1929", "dtype": "float32"}, {"name": "1930", "dtype": "float32"}, {"name": "1931", "dtype": "float32"}, {"name": "1932", "dtype": "float32"}, {"name": "1933", "dtype": "float32"}, {"name": "1934", "dtype": "float32"}, {"name": "1935", "dtype": "float32"}, {"name": "1936", "dtype": "float32"}, {"name": "1937", "dtype": "float32"}, {"name": "1938", "dtype": "float32"}, {"name": "1939", "dtype": "float32"}, {"name": "1940", "dtype": "float32"}, {"name": "1941", "dtype": "float32"}, {"name": "1942", "dtype": "float32"}, {"name": "1943", "dtype": "float32"}, {"name": "1944", "dtype": "float32"}, {"name": "1945", "dtype": "float32"}, {"name": "1946", "dtype": "float32"}, {"name": "1947", "dtype": "float32"}, {"name": "1948", "dtype": "float32"}, {"name": "1949", "dtype": "float32"}, {"name": "1950", "dtype": "float32"}, {"name": "1951", "dtype": "float32"}, {"name": "1952", "dtype": "float32"}, {"name": "1953", "dtype": "float32"}, {"name": "1954", "dtype": "float32"}, {"name": "1955", "dtype": "float32"}, {"name": "1956", "dtype": "float32"}, {"name": "1957", "dtype": "float32"}, {"name": "1958", "dtype": "float32"}, {"name": "1959", "dtype": "float32"}, {"name": "1960", "dtype": "float32"}, {"name": "1961", "dtype": "float32"}, {"name": "1962", "dtype": "float32"}, {"name": "1963", "dtype": "float32"}, {"name": "1964", "dtype": "float32"}, {"name": "1965", "dtype": "float32"}, {"name": "1966", "dtype": "float32"}, {"name": "1967", "dtype": "float32"}, {"name": "1968", "dtype": "float32"}, {"name": "1969", "dtype": "float32"}, {"name": "1970", "dtype": "float32"}, {"name": "1971", "dtype": "float32"}, {"name": "1972", "dtype": "float32"}, {"name": "1973", "dtype": "float32"}, {"name": "1974", "dtype": "float32"}, {"name": "1975", "dtype": "float32"}, {"name": "1976", "dtype": "float32"}, {"name": "1977", "dtype": "float32"}, {"name": "1978", "dtype": "float32"}, {"name": "1979", "dtype": "float32"}, {"name": "1980", "dtype": "float32"}, {"name": "1981", "dtype": "float32"}, {"name": "1982", "dtype": "float32"}, {"name": "1983", "dtype": "float32"}, {"name": "1984", "dtype": "float32"}, {"name": "1985", "dtype": "float32"}, {"name": "1986", "dtype": "float32"}, {"name": "1987", "dtype": 
"float32"}, {"name": "1988", "dtype": "float32"}, {"name": "1989", "dtype": "float32"}, {"name": "1990", "dtype": "float32"}, {"name": "1991", "dtype": "float32"}, {"name": "1992", "dtype": "float32"}, {"name": "1993", "dtype": "float32"}, {"name": "1994", "dtype": "float32"}, {"name": "1995", "dtype": "float32"}, {"name": "1996", "dtype": "float32"}, {"name": "1997", "dtype": "float32"}, {"name": "1998", "dtype": "float32"}, {"name": "1999", "dtype": "float32"}, {"name": "2000", "dtype": "float32"}, {"name": "2001", "dtype": "float32"}, {"name": "2002", "dtype": "float32"}, {"name": "2003", "dtype": "float32"}, {"name": "2004", "dtype": "float32"}, {"name": "2005", "dtype": "float32"}, {"name": "2006", "dtype": "float32"}, {"name": "2007", "dtype": "float32"}, {"name": "2008", "dtype": "float32"}, {"name": "2009", "dtype": "float32"}, {"name": "2010", "dtype": "float32"}, {"name": "2011", "dtype": "float32"}, {"name": "2012", "dtype": "float32"}, {"name": "2013", "dtype": "float32"}, {"name": "2014", "dtype": "float32"}, {"name": "2015", "dtype": "float32"}, {"name": "2016", "dtype": "float32"}, {"name": "2017", "dtype": "float32"}, {"name": "2018", "dtype": "float32"}, {"name": "2019", "dtype": "float32"}, {"name": "2020", "dtype": "float32"}, {"name": "2021", "dtype": "float32"}, {"name": "2022", "dtype": "float32"}, {"name": "2023", "dtype": "float32"}, {"name": "2024", "dtype": "float32"}, {"name": "2025", "dtype": "float32"}, {"name": "2026", "dtype": "float32"}, {"name": "2027", "dtype": "float32"}, {"name": "2028", "dtype": "float32"}, {"name": "2029", "dtype": "float32"}, {"name": "2030", "dtype": "float32"}, {"name": "2031", "dtype": "float32"}, {"name": "2032", "dtype": "float32"}, {"name": "2033", "dtype": "float32"}, {"name": "2034", "dtype": "float32"}, {"name": "2035", "dtype": "float32"}, {"name": "2036", "dtype": "float32"}, {"name": "2037", "dtype": "float32"}, {"name": "2038", "dtype": "float32"}, {"name": "2039", "dtype": "float32"}, {"name": "2040", "dtype": "float32"}, {"name": "2041", "dtype": "float32"}, {"name": "2042", "dtype": "float32"}, {"name": "2043", "dtype": "float32"}, {"name": "2044", "dtype": "float32"}, {"name": "2045", "dtype": "float32"}, {"name": "2046", "dtype": "float32"}, {"name": "2047", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 213730620.21618995, "num_examples": 26057}, {"name": "test", "num_bytes": 71246407.07358725, "num_examples": 8686}], "download_size": 392449335, "dataset_size": 284977027.2897772}}
|
2023-08-23T03:41:25+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "AA_GPTNEO_Finetuned"
More Information needed
|
[
"# Dataset Card for \"AA_GPTNEO_Finetuned\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"AA_GPTNEO_Finetuned\"\n\nMore Information needed"
] |
[
6,
20
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"AA_GPTNEO_Finetuned\"\n\nMore Information needed"
] |
1642a8722f7df653fac9669cbd7b909c8d7e005d
|
# Dataset of houjuu_nue/ε°η£γ¬γ/νΈμ₯¬λμ (Touhou)
This is the dataset of houjuu_nue/ε°η£γ¬γ/νΈμ₯¬λμ (Touhou), containing 500 images and their tags.
The core tags of this character are `black_hair, wings, asymmetrical_wings, red_eyes, short_hair, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 608.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houjuu_nue_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 394.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houjuu_nue_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1136 | 741.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houjuu_nue_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 561.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houjuu_nue_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1136 | 961.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houjuu_nue_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
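All of the download links above point to plain zip archives on the Hugging Face Hub, so any package can also be fetched programmatically. Below is a minimal sketch for the 800px IMG+TXT variant (repo id and filename are taken from the table; the output directory is just an illustrative choice). The Waifuc-Raw archive has its own loading recipe in the next section.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# fetch the 800px IMG+TXT package (filename matches the download link above)
zip_file = hf_hub_download(
    repo_id='CyberHarem/houjuu_nue_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# unpack the image/caption pairs into a local directory
output_dir = 'dataset_800'
os.makedirs(output_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(output_dir)
```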
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/houjuu_nue_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
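As a small usage example, the tag metadata printed in the loop above can be used to filter the extracted images. This is only a sketch; it assumes nothing beyond `item.meta['tags']` holding the tag names, and uses `smile` (one of the tags in the cluster tables below) as the example tag.

```python
from waifuc.source import LocalSource

# keep only the images whose tag metadata mentions a given tag
wanted_tag = 'smile'
matches = [
    item for item in LocalSource('dataset_dir')
    if wanted_tag in item.meta['tags']
]
print(f'{len(matches)} images carry the tag {wanted_tag!r}')
```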
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, black_thighhighs, dress, solo, zettai_ryouiki, smile, snake, trident |
| 1 | 7 |  |  |  |  |  | 1girl, black_thighhighs, dress, snake, solo, trident, zettai_ryouiki |
| 2 | 11 |  |  |  |  |  | 1girl, black_thighhighs, dress, solo, zettai_ryouiki, smile, ahoge |
| 3 | 8 |  |  |  |  |  | 1girl, black_dress, black_thighhighs, blue_wings, looking_at_viewer, red_bowtie, red_wings, short_dress, short_sleeves, smile, solo, trident, bangs, center_frills, hair_between_eyes, holding_weapon, simple_background, zettai_ryouiki, ahoge, blush, medium_breasts, white_background, cowboy_shot, pointy_ears, snake, wristband, buttons, closed_mouth, open_mouth, thighs |
| 4 | 9 |  |  |  |  |  | 1girl, black_dress, black_thighhighs, solo, zettai_ryouiki, looking_at_viewer, smile, short_sleeves |
| 5 | 8 |  |  |  |  |  | 1girl, bangs, black_dress, black_thighhighs, center_frills, red_bowtie, red_wings, short_dress, short_sleeves, snake, solo, trident, blue_wings, blush, footwear_bow, full_body, holding_weapon, looking_at_viewer, red_footwear, shoes, closed_mouth, wristband, :d, open_mouth, simple_background, ufo |
| 6 | 9 |  |  |  |  |  | 1girl, black_dress, solo, looking_at_viewer, red_bowtie, simple_background, upper_body, short_sleeves, white_background |
| 7 | 5 |  |  |  |  |  | 1girl, black_thighhighs, pantyshot, solo, blush, white_panties, snake, black_dress, ufo |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_thighhighs | dress | solo | zettai_ryouiki | smile | snake | trident | ahoge | black_dress | blue_wings | looking_at_viewer | red_bowtie | red_wings | short_dress | short_sleeves | bangs | center_frills | hair_between_eyes | holding_weapon | simple_background | blush | medium_breasts | white_background | cowboy_shot | pointy_ears | wristband | buttons | closed_mouth | open_mouth | thighs | footwear_bow | full_body | red_footwear | shoes | :d | ufo | upper_body | pantyshot | white_panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:--------|:-------|:-----------------|:--------|:--------|:----------|:--------|:--------------|:-------------|:--------------------|:-------------|:------------|:--------------|:----------------|:--------|:----------------|:--------------------|:-----------------|:--------------------|:--------|:-----------------|:-------------------|:--------------|:--------------|:------------|:----------|:---------------|:-------------|:---------|:---------------|:------------|:---------------|:--------|:-----|:------|:-------------|:------------|:----------------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | X | X | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | | X | X | X | | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | | X | | | X | X | | X | X | X | X | X | X | X | X | X | | X | X | X | | | | | X | | X | X | | X | X | X | X | X | X | | | |
| 6 | 9 |  |  |  |  |  | X | | | X | | | | | | X | | X | X | | | X | | | | | X | | | X | | | | | | | | | | | | | | X | | |
| 7 | 5 |  |  |  |  |  | X | X | | X | | | X | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | X | X |
|
CyberHarem/houjuu_nue_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-17T21:21:07+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T12:37:41+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of houjuu\_nue/ε°η£γ¬γ/νΈμ₯¬λμ (Touhou)
=========================================
This is the dataset of houjuu\_nue/ε°η£γ¬γ/νΈμ₯¬λμ (Touhou), containing 500 images and their tags.
The core tags of this character are 'black\_hair, wings, asymmetrical\_wings, red\_eyes, short\_hair, bow', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
05ea88af3c1503f9949e054ebb930a13d532f207
|
# Dataset Card for Evaluation run of Corianas/Quokka_2.7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Corianas/Quokka_2.7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Corianas/Quokka_2.7b](https://huggingface.co/Corianas/Quokka_2.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Corianas__Quokka_2.7b",
"harness_winogrande_5",
split="train")
```
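
The same pattern works for the other task configurations; as a sketch, the per-sample GSM8K details could be pulled like this (the config name follows the `harness_<task>_<n-shot>` scheme used above, and the `"latest"` split is the one referred to at the end of this card):

```python
from datasets import load_dataset

# sketch: per-sample details for the 5-shot GSM8K eval of this model
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_Corianas__Quokka_2.7b",
    "harness_gsm8k_5",
    split="latest",
)
```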
## Latest results
These are the [latest results from run 2023-09-18T03:05:58.053951](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__Quokka_2.7b/blob/main/results_2023-09-18T03-05-58.053951.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.027055369127516778,
"em_stderr": 0.0016615386418947858,
"f1": 0.0843078859060403,
"f1_stderr": 0.0021162612701253174,
"acc": 0.27932236818091244,
"acc_stderr": 0.007830181847252834
},
"harness|drop|3": {
"em": 0.027055369127516778,
"em_stderr": 0.0016615386418947858,
"f1": 0.0843078859060403,
"f1_stderr": 0.0021162612701253174
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401501802
},
"harness|winogrande|5": {
"acc": 0.5548539857932123,
"acc_stderr": 0.013967662954355487
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Corianas__Quokka_2.7b
|
[
"region:us"
] |
2023-08-17T21:25:42+00:00
|
{"pretty_name": "Evaluation run of Corianas/Quokka_2.7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Corianas/Quokka_2.7b](https://huggingface.co/Corianas/Quokka_2.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Corianas__Quokka_2.7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T03:05:58.053951](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__Quokka_2.7b/blob/main/results_2023-09-18T03-05-58.053951.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.027055369127516778,\n \"em_stderr\": 0.0016615386418947858,\n \"f1\": 0.0843078859060403,\n \"f1_stderr\": 0.0021162612701253174,\n \"acc\": 0.27932236818091244,\n \"acc_stderr\": 0.007830181847252834\n },\n \"harness|drop|3\": {\n \"em\": 0.027055369127516778,\n \"em_stderr\": 0.0016615386418947858,\n \"f1\": 0.0843078859060403,\n \"f1_stderr\": 0.0021162612701253174\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \"acc_stderr\": 0.0016927007401501802\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5548539857932123,\n \"acc_stderr\": 0.013967662954355487\n }\n}\n```", "repo_url": "https://huggingface.co/Corianas/Quokka_2.7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_18T03_05_58.053951", "path": ["**/details_harness|drop|3_2023-09-18T03-05-58.053951.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-18T03-05-58.053951.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_18T03_05_58.053951", "path": ["**/details_harness|gsm8k|5_2023-09-18T03-05-58.053951.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-18T03-05-58.053951.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:58:12.174583.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:58:12.174583.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:58:12.174583.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:58:12.174583.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:58:12.174583.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_18T03_05_58.053951", "path": ["**/details_harness|winogrande|5_2023-09-18T03-05-58.053951.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-18T03-05-58.053951.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_58_12.174583", "path": ["results_2023-07-19T15:58:12.174583.parquet"]}, {"split": "2023_09_18T03_05_58.053951", "path": ["results_2023-09-18T03-05-58.053951.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T03-05-58.053951.parquet"]}]}]}
|
2023-09-18T02:06:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Corianas/Quokka_2.7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Corianas/Quokka_2.7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
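The code snippet that originally followed this sentence is not preserved in this dump, so the sketch below is a hedged reconstruction: it assumes the standard `datasets` library and the usual `open-llm-leaderboard/details_<org>__<model>` naming scheme for the details repository (the repo id is therefore an assumption), while the configuration and split names come from the configs listed in this card.
```python
from datasets import load_dataset

# Assumed repo id: details repositories on the Open LLM Leaderboard typically
# follow the pattern "open-llm-leaderboard/details_<org>__<model>".
details = load_dataset(
    "open-llm-leaderboard/details_Corianas__Quokka_2.7b",
    "harness_winogrande_5",   # one of the 64 task configurations of this dataset
    split="latest",           # "latest" points to the newest evaluation run
)
print(details[0])             # one per-sample evaluation record
```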
## Latest results
These are the latest results from run 2023-09-18T03:05:58.053951 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks). You can find each one in the "results" configuration and in the "latest" split of each eval:
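The aggregated numbers themselves live in the "results" configuration; under the same assumed repo id as above, a minimal sketch for fetching them would be:
```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; "latest" is the newest one.
results = load_dataset(
    "open-llm-leaderboard/details_Corianas__Quokka_2.7b",  # assumed repo id
    "results",
    split="latest",
)
print(results[0])
```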
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Corianas/Quokka_2.7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Corianas/Quokka_2.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-18T03:05:58.053951(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Corianas/Quokka_2.7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Corianas/Quokka_2.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-18T03:05:58.053951(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Corianas/Quokka_2.7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Corianas/Quokka_2.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T03:05:58.053951(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
83a607f74681aef83babeccee1419f7a92d197ee
|
# Dataset of junko/η΄η (Touhou)
This is the dataset of junko/η΄η (Touhou), containing 500 images and their tags.
The core tags of this character are `long_hair, blonde_hair, red_eyes, hat, breasts, bangs, very_long_hair, hair_between_eyes, black_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 776.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/junko_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 434.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/junko_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1231 | 917.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/junko_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 684.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/junko_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1231 | 1.26 GiB | [Download](https://huggingface.co/datasets/CyberHarem/junko_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/junko_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, black_dress, chinese_clothes, long_sleeves, looking_at_viewer, sash, solo, tabard, wide_sleeves, crescent, orange_hair, smile |
| 1 | 31 |  |  |  |  |  | 1girl, black_dress, chinese_clothes, long_sleeves, sash, solo, tabard, wide_sleeves, fox_tail, multiple_tails, smile, looking_at_viewer, energy |
| 2 | 9 |  |  |  |  |  | 1girl, black_dress, chinese_clothes, long_sleeves, looking_at_viewer, solo, tabard, smile, white_background, wide_sleeves, sash, simple_background |
| 3 | 5 |  |  |  |  |  | 1girl, black_dress, chinese_clothes, long_sleeves, solo, tabard, wide_sleeves, smile, sash, looking_at_viewer, open_mouth |
| 4 | 35 |  |  |  |  |  | 1girl, black_dress, chinese_clothes, crescent, long_sleeves, phoenix_crown, pom_pom_(clothes), solo, tabard, looking_at_viewer, wide_sleeves, yellow_bowtie, medium_breasts, smile, standing, closed_mouth, red_vest, energy, orange_hair, brown_belt, simple_background, hands_up, white_background |
| 5 | 7 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, open_mouth, solo_focus, blush, phoenix_crown, smile, completely_nude, cum, heart, large_breasts, sex, tongue, indoors, looking_at_viewer, navel, censored, crescent, sweat |
| 6 | 7 |  |  |  |  |  | 1girl, solo, blush, navel, nipples, collarbone, completely_nude, huge_breasts, looking_at_viewer, open_mouth, phoenix_crown, lactation, large_breasts, pussy |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_dress | chinese_clothes | long_sleeves | looking_at_viewer | sash | solo | tabard | wide_sleeves | crescent | orange_hair | smile | fox_tail | multiple_tails | energy | white_background | simple_background | open_mouth | phoenix_crown | pom_pom_(clothes) | yellow_bowtie | medium_breasts | standing | closed_mouth | red_vest | brown_belt | hands_up | 1boy | hetero | nipples | solo_focus | blush | completely_nude | cum | heart | large_breasts | sex | tongue | indoors | navel | censored | sweat | collarbone | huge_breasts | lactation | pussy |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:------------------|:---------------|:--------------------|:-------|:-------|:---------|:---------------|:-----------|:--------------|:--------|:-----------|:-----------------|:---------|:-------------------|:--------------------|:-------------|:----------------|:--------------------|:----------------|:-----------------|:-----------|:---------------|:-----------|:-------------|:-----------|:-------|:---------|:----------|:-------------|:--------|:------------------|:------|:--------|:----------------|:------|:---------|:----------|:--------|:-----------|:--------|:-------------|:---------------|:------------|:--------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 31 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 35 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | X | X | | | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | | | X | | | | | X | | X | | | | | | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 6 | 7 |  |  |  |  |  | X | | | | X | | X | | | | | | | | | | | X | X | | | | | | | | | | | X | | X | X | | | X | | | | X | | | X | X | X | X |
|
CyberHarem/junko_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-17T21:53:02+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T20:15:30+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of junko/η΄η (Touhou)
============================
This is the dataset of junko/η΄η (Touhou), containing 500 images and their tags.
The core tags of this character are 'long\_hair, blonde\_hair, red\_eyes, hat, breasts, bangs, very\_long\_hair, hair\_between\_eyes, black\_headwear', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
189eca49caeca92444a97d3a321fd86f2a706ead
|
# Dataset of hata_no_kokoro/秦γγγ/ννλ
Έμ½μ½λ‘ (Touhou)
This is the dataset of hata_no_kokoro/秦γγγ/ννλ
Έμ½μ½λ‘ (Touhou), containing 500 images and their tags.
The core tags of this character are `long_hair, pink_hair, pink_eyes, bow, very_long_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 711.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_no_kokoro_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 430.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_no_kokoro_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1210 | 878.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_no_kokoro_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 637.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_no_kokoro_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1210 | 1.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_no_kokoro_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hata_no_kokoro_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, bubble_skirt, expressionless, folding_fan, long_sleeves, looking_at_viewer, noh_mask, plaid_shirt, solo, fox_mask, oni_mask |
| 1 | 5 |  |  |  |  |  | 1girl, bubble_skirt, expressionless, folding_fan, fox_mask, long_sleeves, looking_at_viewer, plaid_shirt, solo |
| 2 | 8 |  |  |  |  |  | 1girl, fox_mask, long_sleeves, looking_at_viewer, noh_mask, oni_mask, plaid_shirt, solo, bubble_skirt, expressionless, mouth_mask, wide_sleeves |
| 3 | 14 |  |  |  |  |  | 1girl, bubble_skirt, expressionless, fox_mask, long_sleeves, naginata, plaid_shirt, solo, looking_at_viewer, oni_mask, mouth_mask |
| 4 | 8 |  |  |  |  |  | 1girl, bubble_skirt, circle, closed_mouth, collared_shirt, long_sleeves, looking_at_viewer, plaid_shirt, solo, star_(symbol), triangle, buttons, green_shirt, hair_between_eyes, mask_on_head, orange_skirt, purple_bowtie, white_background, expressionless, fox_mask, simple_background, folding_fan, holding_fan, standing, blue_bowtie, pink_skirt |
| 5 | 9 |  |  |  |  |  | 1girl, closed_mouth, long_sleeves, looking_at_viewer, mask_on_head, plaid_shirt, solo, expressionless, fox_mask, hair_between_eyes, purple_bowtie, upper_body, green_shirt, collared_shirt, simple_background, star_(symbol), white_background |
| 6 | 9 |  |  |  |  |  | 1girl, long_sleeves, solo, wide_sleeves, alternate_costume, floral_print, looking_at_viewer, mask_on_head, blush, hair_between_eyes, obi, sidelocks, closed_mouth, holding, standing, expressionless, alternate_hairstyle, pink_kimono, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bubble_skirt | expressionless | folding_fan | long_sleeves | looking_at_viewer | noh_mask | plaid_shirt | solo | fox_mask | oni_mask | mouth_mask | wide_sleeves | naginata | circle | closed_mouth | collared_shirt | star_(symbol) | triangle | buttons | green_shirt | hair_between_eyes | mask_on_head | orange_skirt | purple_bowtie | white_background | simple_background | holding_fan | standing | blue_bowtie | pink_skirt | upper_body | alternate_costume | floral_print | blush | obi | sidelocks | holding | alternate_hairstyle | pink_kimono |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------------|:--------------|:---------------|:--------------------|:-----------|:--------------|:-------|:-----------|:-----------|:-------------|:---------------|:-----------|:---------|:---------------|:-----------------|:----------------|:-----------|:----------|:--------------|:--------------------|:---------------|:---------------|:----------------|:-------------------|:--------------------|:--------------|:-----------|:--------------|:-------------|:-------------|:--------------------|:---------------|:--------|:------|:------------|:----------|:----------------------|:--------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | X | X | | X | X | | X | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | X | | X | X | | X | X | X | | | | | | X | X | X | | | X | X | X | | X | X | X | | | | | X | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | X | | X | X | | | X | | | | X | | | X | | | | | | X | X | | | X | | | X | | | | X | X | X | X | X | X | X | X |
|
CyberHarem/hata_no_kokoro_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-17T22:02:17+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T16:00:54+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of hata\_no\_kokoro/秦γγγ/ννλ
Έμ½μ½λ‘ (Touhou)
================================================
This is the dataset of hata\_no\_kokoro/秦γγγ/ννλ
Έμ½μ½λ‘ (Touhou), containing 500 images and their tags.
The core tags of this character are 'long\_hair, pink\_hair, pink\_eyes, bow, very\_long\_hair, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
5c8b3b959c92033039c0140fa754192ad21eea50
|
# Dataset Card for "PKDD_BERT_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
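Since the card itself is only a stub, here is a minimal, hedged loading sketch: the repo id and the train/test splits are taken from this entry's metadata, and it assumes the standard `datasets` library is all that is needed (the features are numbered float32 columns — presumably dense BERT embeddings, though that is an inference from the dataset name).
```python
from datasets import load_dataset

# Repo id and split names come from this card's metadata; everything else is an assumption.
ds = load_dataset("EgilKarlsen/PKDD_BERT_Baseline")

train, test = ds["train"], ds["test"]
print(train.column_names[:5])   # numbered float32 feature columns, e.g. ['0', '1', '2', '3', '4']
print(train[0]["0"])            # first component of the first feature vector
```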
|
EgilKarlsen/PKDD_BERT_Baseline
|
[
"region:us"
] |
2023-08-17T22:13:11+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115608907.5, "num_examples": 37500}, {"name": "test", "num_bytes": 38536305.0, "num_examples": 12500}], "download_size": 211879692, "dataset_size": 154145212.5}}
|
2023-08-17T22:18:47+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "PKDD_BERT_Baseline"
More Information needed
|
[
"# Dataset Card for \"PKDD_BERT_Baseline\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"PKDD_BERT_Baseline\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"PKDD_BERT_Baseline\"\n\nMore Information needed"
] |
f232dea5aaf0611ad8ad01bb97dca825b68ddd08
|
# Dataset Card for "PKDD_RoBERTa_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
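Per the repository metadata, each row holds 768 `float32` feature columns (named `0` through `767`) plus a string `label`, split into `train` (37,500 rows) and `test` (12,500 rows). A minimal, hypothetical loading sketch using the standard `datasets`/`pandas` APIs (column and split names are taken from that metadata, not from an official example):

```python
from datasets import load_dataset

ds = load_dataset("EgilKarlsen/PKDD_RoBERTa_Baseline")

# Flatten each split into a feature matrix and a label vector.
train_df = ds["train"].to_pandas()
test_df = ds["test"].to_pandas()

X_train = train_df.drop(columns=["label"]).to_numpy(dtype="float32")
y_train = train_df["label"].to_numpy()
X_test = test_df.drop(columns=["label"]).to_numpy(dtype="float32")
y_test = test_df["label"].to_numpy()

print(X_train.shape, X_test.shape)  # expected: (37500, 768), (12500, 768)
```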
|
EgilKarlsen/PKDD_RoBERTa_Baseline
|
[
"region:us"
] |
2023-08-17T22:25:10+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115608907.5, "num_examples": 37500}, {"name": "test", "num_bytes": 38536305.0, "num_examples": 12500}], "download_size": 211881539, "dataset_size": 154145212.5}}
|
2023-08-17T22:30:46+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "PKDD_RoBERTa_Baseline"
More Information needed
|
[
"# Dataset Card for \"PKDD_RoBERTa_Baseline\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"PKDD_RoBERTa_Baseline\"\n\nMore Information needed"
] |
[
6,
19
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"PKDD_RoBERTa_Baseline\"\n\nMore Information needed"
] |
eead41314b3d0d049c6e8e2ed1d4ba1130a32e7a
|
```python
import json
with open("verbalist/datasets/RyokoAI_ShareGPT52K/sg_90k_part1.json") as f:
dataset1 = json.load(f)
with open("verbalist/datasets/RyokoAI_ShareGPT52K/sg_90k_part2.json") as f:
dataset2 = json.load(f)
dataset = dataset1 + dataset2
# with open("./verbalist/datasets/openchat_sharegpt4_dataset/sharegpt_gpt4.json") as f:
# dataset = json.load(f)
conversation_field = "conversations"
import re
import regex
import hashlib
import numpy as np  # used later for the percentile-based length filter
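# The two helpers below do the first language pass: filter_string rejects text that
# contains CJK, Hangul or the listed Arabic/accented characters, and has_cyrillic
# flags Cyrillic text so it can be excluded from this English split.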
def filter_string(string):
has = True
has_zh = not len(re.findall(r"[\u4e00-\u9fff]+", string)) > 0
has_ko = not len(re.findall(r"[\u3131-\ucb4c]+", string)) > 0
# has_ar = not len(re.findall(r"[\u0621-\u64A]+", string)) > 0
has = has_zh and has_ko
invalid_letters = "ΡΡùéà çΔΔ°ΕΎΕ‘Ψ§ΩΨͺΩΨͺΨΉΨ―ΩΨ§ΨΨ―Ψ©ΩΩΨ£ΩΩΨ§ΩΩΩΩΨ³ΩΨ§Ψ³ΩΩ"
for letter in invalid_letters:
if letter in string:
return False
return has
def has_cyrillic(text):
return bool(regex.search(r"\p{IsCyrillic}", text))
clean_dataset = []
for conversation in dataset:
all_text = "\n".join([item["value"] for item in conversation[conversation_field]])
# print(all_text)
# break
if filter_string(all_text) and not has_cyrillic(all_text):
clean_dataset.append(conversation)
import markdownify
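# correct_string normalises ChatGPT-export artifacts: it turns "<lang>Copy code"
# markers back into fenced code blocks and strips a list of boilerplate
# refusal/disclaimer phrases from the turns.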
def correct_string(string):
string = string.replace("\\_", "_")
string = string.replace("\njavascriptCopy code `", "javascript\n")
languages = [
"css",
"python",
"go",
"html",
"kotlin",
"diff",
"vba",
"sql",
"",
"javascript",
"c",
"cpp",
"sass",
'lua',
"scss",
'php'
]
for lang in languages:
string = string.replace(f"\n{lang}Copy code`", f"{lang}\n")
string = string.replace("`\n```", "\n```")
string = string.replace("\n ", "\n ")
delete_phrases = [
"ΠΠ°ΠΊ ΠΈΡΠΊΡΡΡΡΠ²Π΅Π½Π½ΡΠΉ ΠΈΠ½ΡΠ΅Π»Π»Π΅ΠΊΡ, Ρ Π½Π΅ ΡΠ²Π»ΡΡΡΡ Π²Π»Π°Π΄Π΅Π»ΡΡΠ΅ΠΌ ΡΠΈΠ·ΠΈΡΠ΅ΡΠΊΠΈΡ
ΠΎΠ±ΡΠ΅ΠΊΡΠΎΠ² ΠΈ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΏΡΠΎΠ΄Π°Π²Π°ΡΡ ΠΈΠ»ΠΈ ΠΏΠΎΠΊΡΠΏΠ°ΡΡ ΠΏΡΠ΅Π΄ΠΌΠ΅ΡΡ. ΠΠ΄Π½Π°ΠΊΠΎ, Ρ ΠΌΠΎΠ³Ρ ΠΏΠΎΠ΄Π΅Π»ΠΈΡΡΡΡ ΡΠΎΠ²Π΅ΡΠΎΠΌ, ΠΊΠ°ΠΊ ΠΌΠΎΠΆΠ½ΠΎ ΠΏΠΎΠΏΡΡΠ°ΡΡΡΡ ΡΠ±Π΅Π΄ΠΈΡΡ ΠΊΠΎΠ³ΠΎ-ΡΠΎ Π² ΠΏΠΎΠΊΡΠΏΠΊΠ΅ ΠΊΠ°ΡΠ°Π½Π΄Π°ΡΠ°.",
"ΠΠ°ΠΊ ΠΈΡΠΊΡΡΡΡΠ²Π΅Π½Π½ΡΠΉ ΠΈΠ½ΡΠ΅Π»Π»Π΅ΠΊΡ, Ρ Π½Π΅ ΠΈΠΌΠ΅Ρ Π»ΠΈΡΠ½ΡΡ
ΡΡΠ²ΡΡΠ² ΠΈ ΠΌΠ½Π΅Π½ΠΈΠΉ, ΠΈ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΈΠΌΠ΅ΡΡ ΠΏΡΠ΅Π΄ΠΏΠΎΡΡΠ΅Π½ΠΈΠΉ Π² Π²ΡΠ±ΠΎΡΠ΅ ΠΌΠ΅ΠΆΠ΄Ρ ΡΠΎΠΆΠ΄Π΅Π½ΠΈΠ΅ΠΌ ΡΠ²ΠΎΠ΅Π³ΠΎ ΡΠ΅Π±Π΅Π½ΠΊΠ° ΠΈ ΡΡΡΠ½ΠΎΠ²Π»Π΅Π½ΠΈΠ΅ΠΌ ΠΏΡΠΈΠ΅ΠΌΠ½ΠΎΠ³ΠΎ ΡΠ΅Π±Π΅Π½ΠΊΠ° ΠΈΠ· ΠΏΡΠΈΡΡΠ°.",
"1 / 1",
"2 / 2",
"3 / 3",
"4 / 4",
"5 / 5",
"6 / 6",
"7 / 7",
"8 / 8",
"9 / 9",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΏΡΠΎΠ²Π΅ΡΠΈΡΡ Π΄Π°ΡΡ Π²ΠΎΠΏΡΠΎΡΠ°, Π½ΠΎ Ρ ΠΌΠΎΠ³Ρ ΠΏΡΠ΅Π΄ΠΎΡΡΠ°Π²ΠΈΡΡ ΠΈΠ½ΡΠΎΡΠΌΠ°ΡΠΈΡ ΠΎ ΠΠ°ΠΊΡΠΈΠΌΠ΅ Π Π°Π΄Π°ΠΉΠΊΠΈΠ½Π΅ ΠΈ ΠΠΎΡΠΈΡΠ΅ ΠΠ°ΡΡΠΈΠ½ΠΊΠ΅Π²ΠΈΡΠ΅ Π½Π° Π½Π°ΡΠ°Π»ΠΎ 2021 Π³ΠΎΠ΄Π°.",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π²ΡΡΠ°Π²ΠΈΡΡ ΠΏΡΠΈΠΌΠ΅Ρ Π±Π°Π·Ρ Π΄Π°Π½Π½ΡΡ
Π² ΡΠ°Ρ, Π½ΠΎ Ρ ΠΌΠΎΠ³Ρ ΠΎΠ±ΡΡΡΠ½ΠΈΡΡ, ΡΡΠΎ ΡΡΠΎ ΡΠ°ΠΊΠΎΠ΅.",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π½Π°ΠΏΠΈΡΠ°ΡΡ ΠΏΠΎΠ»Π½ΠΎΡΠ΅Π½Π½ΠΎΠ΅ ΡΠ°ΡΡΠΈΡΠ΅Π½ΠΈΠ΅ Π΄Π»Ρ Google Chrome Π² ΡΠ°ΠΌΠΊΠ°Ρ
ΡΡΠΎΠΉ ΡΠ΅ΡΡΠΈΠΈ. ΠΠ΄Π½Π°ΠΊΠΎ,",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π²ΡΠ±ΡΠ°ΡΡ ΠΌΠ°ΡΠ΅ΡΠΈΠ°Π»Ρ ΠΈ Π΄ΠΈΠ·Π°ΠΉΠ½ Π·Π° Π²Π°Ρ, ΡΠ°ΠΊ ΠΊΠ°ΠΊ ΡΡΠΎ Π·Π°Π²ΠΈΡΠΈΡ ΠΎΡ Π²Π°ΡΠΈΡ
ΠΏΠΎΡΡΠ΅Π±Π½ΠΎΡΡΠ΅ΠΉ ΠΈ ΠΏΡΠ΅Π΄ΠΏΠΎΡΡΠ΅Π½ΠΈΠΉ. ΠΠ΄Π½Π°ΠΊΠΎ,",
"ΠΠ·Π²ΠΈΠ½ΠΈΡΠ΅, Π½ΠΎ Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΡΠΎΠ·Π΄Π°ΡΡ ΠΊΠΎΠ΄ Π΄Π»Ρ ΡΠ°ΠΊΠΎΠΉ ΡΠ»ΠΎΠΆΠ½ΠΎΠΉ ΠΏΡΠΎΠ³ΡΠ°ΠΌΠΌΡ Ρ Π½ΡΠ»Ρ, ΡΠ°ΠΊ ΠΊΠ°ΠΊ ΡΡΠΎ ΠΏΠΎΡΡΠ΅Π±ΠΎΠ²Π°Π»ΠΎ Π±Ρ ΠΎΠ±ΡΠΈΡΠ½ΡΡ
ΠΈΡΡΠ»Π΅Π΄ΠΎΠ²Π°Π½ΠΈΠΉ, ΡΠ΅ΡΡΡΡΠΎΠ² ΠΈ ΠΎΠΏΡΡΠ°. Π’Π΅ΠΌ Π½Π΅ ΠΌΠ΅Π½Π΅Π΅,",
"As an AI language model",
"I'm sorry, but I'm a text-based AI language model and don't have the capability to create tables.",
"Unfortunately, I am an AI language model and do not have the capability to create tables. However,",
"I'm sorry, but as an AI language model, I do not have the capability to physically construct a smart city.",
"Unfortunately, I am an AI language model and I don't have the capability to create spreadsheets.",
"I'm sorry for the delay. Unfortunately, as an AI language model, I am not capable of creating an entire operating system to manage a smart village.",
"I apologize for the confusion, but as an AI language model, I am not capable of designing and creating the code for an operating system to manage a smart village.",
"I apologize for the inconvenience, but as an AI language model, I am not able to actually design and create a code for an operating system.",
"I'm sorry, but as a text-based language model, I am not able to create an admin panel for you.",
'ΠΠ°ΠΊ ΠΌΠΎΠ΄Π΅Π»Ρ ΠΠ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΎΡΠ΅Π½ΠΈΡΡ, ΡΠ²Π»ΡΠ΅ΡΡΡ Π»ΠΈ ΠΏΡΠΎΠΈΠ·Π½Π΅ΡΠ΅Π½ΠΈΠ΅ ΡΡΠ°Π·Ρ "ΡΠΎΠ»ΡΠ½ΡΠΉ ΠΎΠ³ΡΡΠ΅Ρ" ΡΠ°ΡΠΈΠΎΠ½Π°Π»ΡΠ½ΡΠΌ ΠΈΡΠΏΠΎΠ»ΡΠ·ΠΎΠ²Π°Π½ΠΈΠ΅ΠΌ Π²ΡΠ΅ΠΌΠ΅Π½ΠΈ ΠΈΠ»ΠΈ Π½Π΅Ρ, ΠΏΠΎΡΠΎΠΌΡ ΡΡΠΎ ΡΡΠΎ Π²ΠΎΠΏΡΠΎΡ ΠΎΡΠ΅Π½ΠΊΠΈ ΡΠ΅Π½Π½ΠΎΡΡΠΈ ΠΈ ΡΠ΅Π»Π΅ΠΉ ΡΠ΅Π»ΠΎΠ²Π΅ΠΊΠ°.',
"I'm sorry, but as an AI language model, I don't have the capability to create visual presentations like a PowerPoint presentation. However,",
""
]
for phrase in delete_phrases:
string = string.replace(phrase, "").strip()
return string
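# filter_keywords drops a whole conversation if it mentions any blocked keyword
# (ChatGPT/jailbreak references, canned refusals, politically sensitive terms, etc.).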
def filter_keywords(string):
keywords = [
"chatgpt",
"ΡΠ°ΡΠ³ΠΏΡ",
"sharegpt",
"add_user_to_chatroom()",
"ΠΌΠΈΡ",
"Π²ΠΎΠΉΠ½",
"ΡΠΎΡΡΠΈΡ",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΏΡΠΎΠ΄ΠΎΠ»ΠΆΠΈΡΡ ΠΏΠΈΡΠ°ΡΡ Π½Π° ΡΡΡΡΠΊΠΎΠΌ ΡΠ·ΡΠΊΠ΅, ΠΏΠΎΡΠΎΠΌΡ ΡΡΠΎ Ρ ΠΎΠ³ΡΠ°Π½ΠΈΡΠ΅Π½",
"Π― ΠΏΡΠΎΡΡ ΠΏΡΠΎΡΠ΅Π½ΠΈΡ, Π½ΠΎ, ΠΊΠ°ΠΊ Ρ ΡΠΆΠ΅ ΡΠΏΠΎΠΌΠΈΠ½Π°Π» ΡΠ°Π½Π΅Π΅",
"Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π²ΡΠΏΠΎΠ»Π½ΠΈΡΡ",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π½Π°ΠΏΠΈΡΠ°ΡΡ Π½ΠΎΡΡ Π΄Π»Ρ Π½Π΅ΡΡΡΠ΅ΡΡΠ²ΡΡΡΠΈΡ
ΡΡΠΈΡ
ΠΎΠ²,",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΡΠ³Π΅Π½Π΅ΡΠΈΡΠΎΠ²Π°ΡΡ ΠΏΠΎΠ»Π½ΡΠΉ ΠΊΠΎΠ΄ Π±ΡΠ°ΡΠ·Π΅ΡΠ½ΠΎΠΉ ΠΈΠ³ΡΡ",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΏΡΠΎΠ²Π΅ΡΡΠΈ ΡΠ°ΠΊΠΎΠΉ ΠΏΠΎΠ΄ΡΡΠ΅Ρ, ΠΏΠΎΡΠΎΠΌΡ ΡΡΠΎ ΡΡΠΎ ΠΏΠΎΡΡΠ΅Π±ΠΎΠ²Π°Π»ΠΎ Π±Ρ ΡΡΡΠ½ΠΎΠΉ ΠΎΠ±ΡΠ°Π±ΠΎΡΠΊΠΈ",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π½Π°Π·Π²Π°ΡΡ ΡΠΎΡΠ½ΡΡ ΡΠΈΡΡΡ, ΡΠ°ΠΊ ΠΊΠ°ΠΊ ΡΡΠΎ ΡΡΠ±ΡΠ΅ΠΊΡΠΈΠ²Π½ΡΠΉ Π²ΠΎΠΏΡΠΎΡ, Π·Π°Π²ΠΈΡΡΡΠΈΠΉ ΠΎΡ ΠΌΠ½ΠΎΠ³ΠΈΡ
ΡΠ°ΠΊΡΠΎΡΠΎΠ².",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π²ΡΠΏΠΎΠ»Π½ΠΈΡΡ Π²Π°Ρ Π·Π°ΠΏΡΠΎΡ, ΡΠ°ΠΊ ΠΊΠ°ΠΊ ΡΡΠΎ Π½Π°ΡΡΡΠ°Π΅Ρ ΠΌΠΎΠΈ ΡΡΠΈΡΠ΅ΡΠΊΠΈΠ΅ ΠΏΡΠΈΠ½ΡΠΈΠΏΡ ΠΈ ΠΌΠΎΠΆΠ΅Ρ ΠΏΡΠΈΡΠΈΠ½ΠΈΡΡ Π²ΡΠ΅Π΄.",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΎΡΠ²Π΅ΡΠΈΡΡ Π½Π° ΡΡΠΎΡ Π²ΠΎΠΏ",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΠΏΡΠ΅Π΄ΠΎΡΡΠ°Π²ΠΈΡΡ Π²Π°ΠΌ Π°ΠΊΡΡΠ°Π»ΡΠ½ΡΠ΅ Π΄Π°Π½Π½ΡΠ΅ ΠΎ ΡΡΠ΅Π΄Π½Π΅Π΄ΡΡΠ΅Π²ΡΡ
Π΄Π΅Π½Π΅ΠΆΠ½ΡΡ
Π΄ΠΎΡ
ΠΎΠ΄Π°Ρ
Π½Π°ΡΠ΅Π»Π΅Π½ΠΈΡ ΠΏΠΎ Π³ΠΎΡΠΎΠ΄Π°ΠΌ Π ΠΎΡΡΠΈΠΈ"
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΡΠΎΡΠ½ΠΎ ΠΎΡΠ²Π΅ΡΠΈΡΡ Π½Π° ΡΡΠΎΡ Π²ΠΎΠΏΡΠΎΡ, ΡΠ°ΠΊ ΠΊΠ°ΠΊ ΠΎΠ±ΡΠ΅ΠΌ ΠΈΠ·ΡΡΠ΅Π½Π½ΠΎΠΉ ΠΈΠ½ΡΠΎΡΠΌΠ°ΡΠΈΠΈ",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΡΠΎΠ·Π΄Π°Π²",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΡΠΈΡΠΎΠ²Π°ΡΡ Π² ASCII-ΡΡΠΈΠ»Π΅, ΡΠ°ΠΊ ΠΊΠ°ΠΊ Ρ ΡΠΎΠ»ΡΠΊΠΎ ΡΠ΅ΠΊΡΡΠΎΠ²Π°Ρ ΠΏΡΠΎΠ³ΡΠ°ΠΌΠΌΠ°.",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ ΡΠΎΠ·Π΄Π°Π²Π°ΡΡ ΠΈΠ·ΠΎΠ±ΡΠ°ΠΆΠ΅Π½ΠΈΡ Π½Π°ΠΏΡΡΠΌΡΡ Π² ΡΡΠΎΠΌ ΠΎΠΊΠ½Π΅ ΡΠ°ΡΠ°.",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π½Π°ΡΠΈΡΠΎΠ²Π°ΡΡ ΡΡΠ΅Π½Ρ ΠΈΠ· ΠΠ²Π°Π½Π³Π΅Π»ΠΈΠΎΠ½Π°, ΡΠ°ΠΊ ΠΊΠ°ΠΊ Ρ ΡΠ΅ΠΊΡΡΠΎΠ²Π°Ρ ΠΏΡΠΎΠ³ΡΠ°ΠΌΠΌΠ°",
"Π ΡΠΊΠΎΠ»ΡΠΊΠΎ Π½ΡΠ»Π΅ΠΉ?",
"Π ΡΠΎΠΆΠ°Π»Π΅Π½ΠΈΡ, Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π½Π°ΠΏΠΈΡΠ°ΡΡ ΠΊΠ½ΠΈΠ³Ρ",
"ΠΠ·Π²ΠΈΠ½ΠΈΡΠ΅, Π½ΠΎ, ΠΊΠ°ΠΊ ΡΠΏΠΎΠΌΠΈΠ½Π°Π»ΠΎΡΡ ΡΠ°Π½Π΅Π΅, ΠΈΠ½ΡΠΎΡΠΌΠ°ΡΠΈΡ, ΠΏΡΠ΅Π΄ΡΡΠ°Π²Π»Π΅Π½Π½Π°Ρ Π² Π½Π°ΡΠ΅ΠΌ ΡΠ°Π·Π³ΠΎΠ²ΠΎΡΠ΅, Π½Π΅ ΠΏΠΎΠ΄Ρ
ΠΎΠ΄ΠΈΡ ΠΈ Π½Π΅ ΡΡΠΈΡΠ½Π°",
"ΠΠ·Π²ΠΈΠ½ΠΈΡΠ΅, Π½ΠΎ ΠΊΠ°ΠΊ ΡΠ·ΡΠΊΠΎΠ²Π°Ρ ΠΌΠΎΠ΄Π΅Π»Ρ ΠΠ Ρ Π½Π΅ ΠΌΠΎΠ³Ρ Π³Π΅Π½Π΅ΡΠΈΡΠΎΠ²Π°ΡΡ ΠΊΠΎΠ΄, ΠΊΠΎΡΠΎΡΡΠΉ ΡΠΏΡΠ°Π²Π»ΡΠ΅Ρ Π°Π΄ΠΌΠΈΠ½ΠΈΡΡΡΠ°ΡΠΈΠ΅ΠΉ",
"ΠΊΠ°ΠΊ ΡΠ·ΡΠΊΠΎΠ²Π°Ρ ΠΌΠΎΠ΄Π΅Π»Ρ",
"OpenAI",
"ΠΡΠΎΡΡ ΠΏΡΠΎΡΠ΅Π½ΠΈΡ, Π½ΠΎ, ΠΏΠΎΡ
ΠΎΠΆΠ΅, Π½Π°Ρ ΡΠ°Π·Π³ΠΎΠ²ΠΎΡ ΠΏΡΠΎΠ΄ΠΎΠ»ΠΆΠ°Π΅ΡΡΡ ΡΠΆΠ΅ Π΄Π°Π²Π½ΠΎ, ΠΈ Ρ Π½Π΅ ΡΠ²Π΅ΡΠ΅Π½, ΠΊΠ°ΠΊΠΎΠ²Π° ΡΠ΅ΠΊΡΡΠ°Ρ ΡΠ΅ΠΌΠ°.",
"ΡΠ²Π»ΡΡΡΡ ΡΠ·ΡΠΊΠΎΠ²ΠΎΠΉ ΠΌΠΎΠ΄Π΅Π»ΡΡ ΠΠ",
"I cannot create a program for managing",
"Π½Π΅ΠΎΠ½Π°ΡΠΈ",
"ΡΠΊΡΠ°ΠΈΠ½",
"provide instructions or assistance on hacking or any other illegal activities",
"I cannot fulfill your request as it goes against ethical and moral",
"I cannot do your math homework for you",
"adhering to ethical and moral standards",
"!GPT",
"Developer Mode Output",
"are illegal or unethical.",
"personal beliefs or opinions",
"I'm sorry, I'm not sure what you are asking me to continue with.",
"but I'm still unclear on what you would like me to continue with",
"DAN",
"/jailbroken",
"Ukrain",
"is not appropriate to use that word as a variable name",
"is not appropriate",
"Ψ§ΩΩ
Ω",
]
for keyword in keywords:
if keyword.lower() in string.lower():
return False
return True
total_string = ""
debug_dataset = False
# debug_dataset = True
unsensored_filtered_dataset = []
total_convs = 10000
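# Main pass: convert every turn to Markdown, drop a leading chathub.gg advert turn
# if present, keep only keyword-clean conversations and store them as alternating
# user/bot message pairs.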
for pos, conversation in enumerate(clean_dataset):
conversation = [
str(markdownify.markdownify(item["value"], heading_style="ATX"))
for item in conversation[conversation_field]
]
conversation_pairs = []
if "https://chathub.gg" in conversation[0]:
conversation.pop(0)
full_text = " ".join(conversation)
if filter_keywords(full_text):
for i in range(1, len(conversation)):
if (i + 1) % 2 == 0:
if debug_dataset:
bot_message = "BOT " + correct_string(conversation[i])
user_message = "USER " + correct_string(conversation[i - 1])
else:
bot_message = correct_string(conversation[i])
user_message = correct_string(conversation[i - 1])
conversation_pairs.append(user_message)
conversation_pairs.append(bot_message)
if len(conversation_pairs) > 0:
unsensored_filtered_dataset.append(conversation_pairs)
if debug_dataset:
all_text = "\n===\n".join([item for item in conversation_pairs])
total_string += all_text
total_string += "===" * 10
total_string += "\n"
total_string += "===" * 10
total_string += "\n"
total_string += "===" * 10
total_string += "\n"
if pos > total_convs and debug_dataset:
break
print(total_string)
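# Measure each conversation with the Llama-2 tokenizer and keep only conversations
# below the 85th percentile of token counts (filter_num).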
from transformers import AutoTokenizer
from verbalist.datasets.utils import visualize_hist
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
conversation_lengths = []
for conversation in unsensored_filtered_dataset:
all_text = "\n===\n".join([item for item in conversation])
conversation_lengths.append(len(tokenizer(all_text)["input_ids"]))
# print(all_text)
# print("="*100)
# print("="*100)
# print("="*100)
# break
# if has_cyrillic(all_text):
# rus_conv.append(conversation)
visualize_hist(conversation_lengths, "ru_share_gpt_filtered")
filter_num = 85
passed_convs = (
np.array(conversation_lengths) < np.percentile(conversation_lengths, filter_num)
).tolist()
unsensored_passed = []
for i, status in enumerate(passed_convs):
if status:
unsensored_passed.append(unsensored_filtered_dataset[i])
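# Attach a sha256 hash of the first user turn to every surviving conversation;
# the hash can later be used for deduplication or provenance tracking.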
unsensored_dataset = []
for conv in unsensored_passed:
conv_hash = hashlib.sha256(conv[0].encode('utf-8')).hexdigest()
unsensored_dataset.append({
"conversation": conv,
"hash": conv_hash
})
from datasets import Dataset
dataset = Dataset.from_list(unsensored_dataset)
dataset_sample = dataset.train_test_split(test_size=3000, seed=42)
dataset_sample = dataset_sample['test']
dataset_sample.push_to_hub("dim/sharegpt_short_en")
```
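
The 3,000-conversation sample pushed above can be pulled back from the Hub with the standard `datasets` API; a minimal sketch, assuming the feature and split names given in the repository metadata:

```python
from datasets import load_dataset

ds = load_dataset("dim/sharegpt_short_en_3k", split="train")

# Each row stores the alternating user/assistant turns plus a sha256 hash of the first turn.
example = ds[0]
print(example["hash"])
for turn in example["conversation"]:
    print(turn[:80])
```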
|
dim/sharegpt_short_en_3k
|
[
"license:cc-by-nc-4.0",
"region:us"
] |
2023-08-17T22:29:13+00:00
|
{"license": "cc-by-nc-4.0", "dataset_info": {"features": [{"name": "conversation", "sequence": "string"}, {"name": "hash", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 12645774.065189784, "num_examples": 3000}], "download_size": 5970469, "dataset_size": 12645774.065189784}}
|
2023-08-17T22:32:46+00:00
|
[] |
[] |
TAGS
#license-cc-by-nc-4.0 #region-us
|
", "\n
|
[] |
[
"TAGS\n#license-cc-by-nc-4.0 #region-us \n"
] |
[
17
] |
[
"passage: TAGS\n#license-cc-by-nc-4.0 #region-us \n"
] |
d5f16b2d1728fb1124064d5a74eda2995c5255e4
|
# Dataset Card for "PKDD_DistilRoBERTa_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
EgilKarlsen/PKDD_DistilRoBERTa_Baseline
|
[
"region:us"
] |
2023-08-17T22:34:22+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115608907.5, "num_examples": 37500}, {"name": "test", "num_bytes": 38536305.0, "num_examples": 12500}], "download_size": 211880087, "dataset_size": 154145212.5}}
|
2023-08-17T22:39:58+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "PKDD_DistilRoBERTa_Baseline"
More Information needed
|
[
"# Dataset Card for \"PKDD_DistilRoBERTa_Baseline\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"PKDD_DistilRoBERTa_Baseline\"\n\nMore Information needed"
] |
[
6,
21
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"PKDD_DistilRoBERTa_Baseline\"\n\nMore Information needed"
] |
3f58130c0eec6c7716b48136573ce841ef8bb274
|
# Dataset of lily_white/γͺγͺγΌγγ―γ€/릴리νμ΄νΈ (Touhou)
This is the dataset of lily_white/γͺγͺγΌγγ―γ€/릴리νμ΄νΈ (Touhou), containing 500 images and their tags.
The core tags of this character are `blonde_hair, long_hair, hat, bow, wings, fairy_wings, blue_eyes, white_headwear, red_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 643.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lily_white_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 364.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lily_white_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1146 | 752.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lily_white_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 568.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lily_white_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1146 | 1.03 GiB | [Download](https://huggingface.co/datasets/CyberHarem/lily_white_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
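For the packaged `IMG+TXT` variants in the table above, the sketch below downloads the 800px package and pairs every image with its tag file. This is a minimal sketch under an assumption: each image is expected to sit next to a same-named `.txt` file containing its tags (the usual IMG+TXT layout), and the local directory name is hypothetical.
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/lily_white_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract into a local folder (hypothetical name)
img_dir = 'lily_white_800'
os.makedirs(img_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(img_dir)
# pair each image with its same-named tag file (assumed IMG+TXT convention)
for name in sorted(os.listdir(img_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        txt_path = os.path.join(img_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, 'r', encoding='utf-8') as f:
                tags = f.read().strip()
            print(name, '->', tags)
```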
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lily_white_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, capelet, open_mouth, solo, blue_sky, cloud, day, long_sleeves, petals, wide_sleeves, blush, looking_at_viewer, white_dress, outstretched_arms, very_long_hair, :d |
| 1 | 5 |  |  |  |  |  | 1girl, capelet, long_sleeves, looking_at_viewer, open_mouth, solo, white_dress, blush, outstretched_arms, wide_sleeves, :d, petals, very_long_hair |
| 2 | 13 |  |  |  |  |  | 1girl, simple_background, solo, white_background, white_capelet, white_dress, long_sleeves, open_mouth, red_bowtie, wide_sleeves, :d, blush, hair_between_eyes, fairy, upper_body, bangs, looking_at_viewer, ^_^ |
| 3 | 6 |  |  |  |  |  | 1girl, bangs, blush, looking_at_viewer, open_mouth, simple_background, solo, white_background, white_capelet, white_dress, :d, long_sleeves, hair_bow, red_bowtie, upper_body, hair_between_eyes, petals |
| 4 | 5 |  |  |  |  |  | 1girl, fairy, long_sleeves, open_mouth, solo, upper_teeth_only, white_capelet, white_dress, wide_sleeves, cherry_blossoms, :d, blush, full_body, white_background, looking_at_viewer, shoes, simple_background |
| 5 | 10 |  |  |  |  |  | 1girl, dress, smile, solo, capelet, open_mouth, petals |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | capelet | open_mouth | solo | blue_sky | cloud | day | long_sleeves | petals | wide_sleeves | blush | looking_at_viewer | white_dress | outstretched_arms | very_long_hair | :d | simple_background | white_background | white_capelet | red_bowtie | hair_between_eyes | fairy | upper_body | bangs | ^_^ | hair_bow | upper_teeth_only | cherry_blossoms | full_body | shoes | dress | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-------------|:-------|:-----------|:--------|:------|:---------------|:---------|:---------------|:--------|:--------------------|:--------------|:--------------------|:-----------------|:-----|:--------------------|:-------------------|:----------------|:-------------|:--------------------|:--------|:-------------|:--------|:------|:-----------|:-------------------|:------------------|:------------|:--------|:--------|:--------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 2 | 13 |  |  |  |  |  | X | | X | X | | | | X | | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | X | | | | X | X | | X | X | X | | | X | X | X | X | X | X | | X | X | | X | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | X | | | | X | | X | X | X | X | | | X | X | X | X | | | X | | | | | X | X | X | X | | |
| 5 | 10 |  |  |  |  |  | X | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X |
|
CyberHarem/lily_white_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-17T22:38:30+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-15T07:14:00+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of lily\_white/γͺγͺγΌγγ―γ€/릴리νμ΄νΈ (Touhou)
============================================
This is the dataset of lily\_white/γͺγͺγΌγγ―γ€/릴리νμ΄νΈ (Touhou), containing 500 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, hat, bow, wings, fairy\_wings, blue\_eyes, white\_headwear, red\_bow', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
3757c8f4d4fa56df8ad1a4d2fe5f7385b0dcfbcc
|
# Dataset of ibuki_suika/δΌεΉθι¦/μ΄λΆν€μ€μ΄μΉ΄ (Touhou)
This is the dataset of ibuki_suika/δΌεΉθι¦/μ΄λΆν€μ€μ΄μΉ΄ (Touhou), containing 500 images and their tags.
The core tags of this character are `horns, long_hair, bow, hair_bow, ribbon, horn_ornament, horn_ribbon, orange_hair, very_long_hair, blonde_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 574.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ibuki_suika_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 373.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ibuki_suika_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1037 | 704.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ibuki_suika_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 524.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ibuki_suika_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1037 | 916.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ibuki_suika_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ibuki_suika_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
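As a rough illustration of how such clusters can be mined, the sketch below counts tag frequencies over the `.txt` files of an already-extracted IMG+TXT package. It is only a sketch under assumptions: the directory name is hypothetical and tags are assumed to be comma-separated inside each file.
```python
import os
from collections import Counter

# folder holding an extracted IMG+TXT package (hypothetical path)
tag_dir = 'ibuki_suika_800'
counter = Counter()
for name in os.listdir(tag_dir):
    if name.endswith('.txt'):
        with open(os.path.join(tag_dir, name), 'r', encoding='utf-8') as f:
            # tags are assumed comma-separated, e.g. "1girl, chain, gourd"
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        counter.update(tags)
# the most frequent tags hint at recurring outfits and accessories
for tag, count in counter.most_common(20):
    print(f'{count:4d}  {tag}')
```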
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, chain, skirt, smile, solo, wrist_cuffs, belt, gourd, sleeveless_shirt, yellow_eyes, looking_at_viewer, open_mouth |
| 1 | 8 |  |  |  |  |  | 1girl, chain, fang, open_mouth, solo, yellow_eyes, gourd, smile, wrist_cuffs |
| 2 | 6 |  |  |  |  |  | 1girl, sakazuki, solo, gourd, sake, smile, chain, sleeveless_shirt, wrist_cuffs, brown_eyes, open_mouth, skirt |
| 3 | 6 |  |  |  |  |  | 1girl, gourd, sakazuki, sake, solo, chain, barefoot, red_eyes, sitting, wrist_cuffs |
| 4 | 10 |  |  |  |  |  | 1girl, bangs, purple_skirt, red_bowtie, sleeveless_shirt, solo, torn_sleeves, white_shirt, wrist_cuffs, belt, looking_at_viewer, blush, open_mouth, blue_skirt, chain, oni_horns, orange_eyes, sidelocks, :d, fang, shackles, gourd, holding, white_background, cowboy_shot, simple_background |
| 5 | 11 |  |  |  |  |  | 1girl, red_bowtie, sleeveless_shirt, solo, white_shirt, white_socks, bangs, black_footwear, shoes, full_body, looking_at_viewer, purple_skirt, sakazuki, wrist_cuffs, low-tied_long_hair, oni_horns, open_mouth, orange_eyes, sidelocks, blush, chain, fang, footwear_bow, holding_cup, purple_ribbon, sake, :d, belt, gourd, simple_background, ribbon_trim, shackles, cube, sitting, white_background |
| 6 | 6 |  |  |  |  |  | 1girl, bangs, closed_mouth, looking_at_viewer, red_bow, sleeveless_shirt, solo, upper_body, simple_background, smile, white_background, white_shirt, blush, orange_eyes, bare_shoulders |
| 7 | 5 |  |  |  |  |  | 1girl, hair_flower, kimono, solo, alternate_costume, looking_at_viewer, obi, floral_print, blush, cherry_blossoms, grin, orange_eyes, petals, wide_sleeves, yellow_eyes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | chain | skirt | smile | solo | wrist_cuffs | belt | gourd | sleeveless_shirt | yellow_eyes | looking_at_viewer | open_mouth | fang | sakazuki | sake | brown_eyes | barefoot | red_eyes | sitting | bangs | purple_skirt | red_bowtie | torn_sleeves | white_shirt | blush | blue_skirt | oni_horns | orange_eyes | sidelocks | :d | shackles | holding | white_background | cowboy_shot | simple_background | white_socks | black_footwear | shoes | full_body | low-tied_long_hair | footwear_bow | holding_cup | purple_ribbon | ribbon_trim | cube | closed_mouth | red_bow | upper_body | bare_shoulders | hair_flower | kimono | alternate_costume | obi | floral_print | cherry_blossoms | grin | petals | wide_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:--------|:-------|:--------------|:-------|:--------|:-------------------|:--------------|:--------------------|:-------------|:-------|:-----------|:-------|:-------------|:-----------|:-----------|:----------|:--------|:---------------|:-------------|:---------------|:--------------|:--------|:-------------|:------------|:--------------|:------------|:-----|:-----------|:----------|:-------------------|:--------------|:--------------------|:--------------|:-----------------|:--------|:------------|:---------------------|:---------------|:--------------|:----------------|:--------------|:-------|:---------------|:----------|:-------------|:-----------------|:--------------|:---------|:--------------------|:------|:---------------|:------------------|:-------|:---------|:---------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | | X | X | X | | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | X | | X | X | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | | | X | X | | X | | | | | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | X | | | X | X | X | X | X | | X | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 11 |  |  |  |  |  | X | X | | | X | X | X | X | X | | X | X | X | X | X | | | | X | X | X | X | | X | X | | X | X | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | X | X | | | | X | | X | | | | | | | | | X | | | | X | X | | | X | | | | | X | | X | | | | | | | | | | | X | X | X | X | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | | X | | | | | X | X | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
CyberHarem/ibuki_suika_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-17T22:40:48+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T12:12:15+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of ibuki\_suika/δΌεΉθι¦/μ΄λΆν€μ€μ΄μΉ΄ (Touhou)
============================================
This is the dataset of ibuki\_suika/δΌεΉθι¦/μ΄λΆν€μ€μ΄μΉ΄ (Touhou), containing 500 images and their tags.
The core tags of this character are 'horns, long\_hair, bow, hair\_bow, ribbon, horn\_ornament, horn\_ribbon, orange\_hair, very\_long\_hair, blonde\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
aacf0ad1418fc96140ad555892824098cc6ebefc
|
# Dataset Card for Evaluation run of Corianas/111m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Corianas/111m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Corianas/111m](https://huggingface.co/Corianas/111m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Corianas__111m",
"harness_winogrande_5",
split="train")
```
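Since there are 64 configurations, it can help to list them before choosing one. The sketch below uses the standard `datasets` helpers only; the config picked at the end is just an example taken from this card's metadata.
```python
from datasets import get_dataset_config_names, load_dataset

# list every evaluation configuration stored in this details repository
configs = get_dataset_config_names("open-llm-leaderboard/details_Corianas__111m")
print(len(configs), "configurations found")
print(configs[:5])

# load one of them; the "latest" split mirrors the most recent run for that task
details = load_dataset(
    "open-llm-leaderboard/details_Corianas__111m",
    "harness_gsm8k_5",
    split="latest",
)
print(details)
```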
## Latest results
These are the [latest results from run 2023-10-28T20:02:47.685862](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__111m/blob/main/results_2023-10-28T20-02-47.685862.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.00029649629898012493,
"f1": 0.026885486577181286,
"f1_stderr": 0.0009984003779091447,
"acc": 0.2509865824782952,
"acc_stderr": 0.007026188129612818
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.00029649629898012493,
"f1": 0.026885486577181286,
"f1_stderr": 0.0009984003779091447
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5019731649565904,
"acc_stderr": 0.014052376259225636
}
}
```
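If only the aggregated numbers are needed, the results file linked above can also be fetched directly as plain JSON. This is a minimal sketch; the filename is taken from the link above and is assumed to sit at the repository root.
```python
import json
from huggingface_hub import hf_hub_download

# fetch the aggregated results file referenced in the link above
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Corianas__111m",
    repo_type="dataset",
    filename="results_2023-10-28T20-02-47.685862.json",
)

with open(results_path, "r", encoding="utf-8") as f:
    results = json.load(f)

# peek at the top-level structure rather than assuming a fixed schema
print(list(results.keys()))
```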
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Corianas__111m
|
[
"region:us"
] |
2023-08-17T22:44:18+00:00
|
{"pretty_name": "Evaluation run of Corianas/111m", "dataset_summary": "Dataset automatically created during the evaluation run of model [Corianas/111m](https://huggingface.co/Corianas/111m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Corianas__111m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T20:02:47.685862](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__111m/blob/main/results_2023-10-28T20-02-47.685862.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.00029649629898012493,\n \"f1\": 0.026885486577181286,\n \"f1_stderr\": 0.0009984003779091447,\n \"acc\": 0.2509865824782952,\n \"acc_stderr\": 0.007026188129612818\n },\n \"harness|drop|3\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.00029649629898012493,\n \"f1\": 0.026885486577181286,\n \"f1_stderr\": 0.0009984003779091447\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5019731649565904,\n \"acc_stderr\": 0.014052376259225636\n }\n}\n```", "repo_url": "https://huggingface.co/Corianas/111m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T20_02_47.685862", "path": ["**/details_harness|drop|3_2023-10-28T20-02-47.685862.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T20-02-47.685862.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T20_02_47.685862", "path": ["**/details_harness|gsm8k|5_2023-10-28T20-02-47.685862.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T20-02-47.685862.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:48:53.093937.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:48:53.093937.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:48:53.093937.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:48:53.093937.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:48:53.093937.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:48:53.093937.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T20_02_47.685862", "path": ["**/details_harness|winogrande|5_2023-10-28T20-02-47.685862.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T20-02-47.685862.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_48_53.093937", "path": ["results_2023-07-19T13:48:53.093937.parquet"]}, {"split": "2023_10_28T20_02_47.685862", "path": ["results_2023-10-28T20-02-47.685862.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T20-02-47.685862.parquet"]}]}]}
|
2023-10-28T19:02:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Corianas/111m
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Corianas/111m on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-28T20:02:47.685862 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Corianas/111m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Corianas/111m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T20:02:47.685862(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Corianas/111m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Corianas/111m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T20:02:47.685862(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
15,
31,
163,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Corianas/111m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Corianas/111m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T20:02:47.685862(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
ed11d9cdce559be3444226235c7e502da9bf2b1d
|
# Dataset Card for Evaluation run of Corianas/Quokka_1.3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Corianas/Quokka_1.3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Corianas/Quokka_1.3b](https://huggingface.co/Corianas/Quokka_1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
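# "harness_winogrande_5" is one of the 64 per-task configurations; the "train" split always points to the latest run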
data = load_dataset("open-llm-leaderboard/details_Corianas__Quokka_1.3b",
"harness_winogrande_5",
split="train")
```
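
The per-task configurations hold the per-example details, while the aggregated scores live in the "results" configuration mentioned above. A minimal sketch of loading them (assuming the "results" configuration follows the usual split layout of these leaderboard detail datasets, with one split per run timestamp plus a "latest" split):

```python
from datasets import load_dataset

# Hypothetical: "latest" points at the most recent aggregated scores;
# timestamped splits hold the results of each individual run.
results = load_dataset("open-llm-leaderboard/details_Corianas__Quokka_1.3b",
	"results",
	split="latest")
```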
## Latest results
These are the [latest results from run 2023-09-23T07:19:08.613938](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__Quokka_1.3b/blob/main/results_2023-09-23T07-19-08.613938.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0018875838926174498,
"em_stderr": 0.0004445109990558716,
"f1": 0.04535549496644304,
"f1_stderr": 0.00121193350790111,
"acc": 0.26361483820047354,
"acc_stderr": 0.007015815814913848
},
"harness|drop|3": {
"em": 0.0018875838926174498,
"em_stderr": 0.0004445109990558716,
"f1": 0.04535549496644304,
"f1_stderr": 0.00121193350790111
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5272296764009471,
"acc_stderr": 0.014031631629827696
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Corianas__Quokka_1.3b
|
[
"region:us"
] |
2023-08-17T22:44:28+00:00
|
{"pretty_name": "Evaluation run of Corianas/Quokka_1.3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Corianas/Quokka_1.3b](https://huggingface.co/Corianas/Quokka_1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Corianas__Quokka_1.3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T07:19:08.613938](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__Quokka_1.3b/blob/main/results_2023-09-23T07-19-08.613938.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.0004445109990558716,\n \"f1\": 0.04535549496644304,\n \"f1_stderr\": 0.00121193350790111,\n \"acc\": 0.26361483820047354,\n \"acc_stderr\": 0.007015815814913848\n },\n \"harness|drop|3\": {\n \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.0004445109990558716,\n \"f1\": 0.04535549496644304,\n \"f1_stderr\": 0.00121193350790111\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5272296764009471,\n \"acc_stderr\": 0.014031631629827696\n }\n}\n```", "repo_url": "https://huggingface.co/Corianas/Quokka_1.3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T07_19_08.613938", "path": ["**/details_harness|drop|3_2023-09-23T07-19-08.613938.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T07-19-08.613938.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T07_19_08.613938", "path": ["**/details_harness|gsm8k|5_2023-09-23T07-19-08.613938.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T07-19-08.613938.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:59:51.596909.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:59:51.596909.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:59:51.596909.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:59:51.596909.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:59:51.596909.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:59:51.596909.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T07_19_08.613938", "path": ["**/details_harness|winogrande|5_2023-09-23T07-19-08.613938.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T07-19-08.613938.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_59_51.596909", "path": ["results_2023-07-19T14:59:51.596909.parquet"]}, {"split": "2023_09_23T07_19_08.613938", "path": ["results_2023-09-23T07-19-08.613938.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T07-19-08.613938.parquet"]}]}]}
|
2023-09-23T06:19:20+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Corianas/Quokka_1.3b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Corianas/Quokka_1.3b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-09-23T07:19:08.613938 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Corianas/Quokka_1.3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Corianas/Quokka_1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T07:19:08.613938(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Corianas/Quokka_1.3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Corianas/Quokka_1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T07:19:08.613938(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Corianas/Quokka_1.3b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Corianas/Quokka_1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T07:19:08.613938(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
6db6958462e41b6207da816453cc77a4f556df76
|
# Dataset Card for Evaluation run of Corianas/gpt-j-6B-Dolly
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Corianas/gpt-j-6B-Dolly
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Corianas/gpt-j-6B-Dolly](https://huggingface.co/Corianas/gpt-j-6B-Dolly) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Corianas__gpt-j-6B-Dolly",
"harness_winogrande_5",
split="train")
```
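
The aggregated leaderboard metrics live in the separate "results" configuration listed in this repository's configuration list. As a minimal sketch building on the snippet above (the config name and the `latest` split are taken from that configuration list), you could load it like this:

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration.
# The "latest" split always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Corianas__gpt-j-6B-Dolly",
    "results",
    split="latest",
)
print(results)
```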
## Latest results
These are the [latest results from run 2023-09-23T01:23:52.389948](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__gpt-j-6B-Dolly/blob/main/results_2023-09-23T01-23-52.389948.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965902,
"f1": 0.08095008389261764,
"f1_stderr": 0.0017837058432559263,
"acc": 0.32814795356315596,
"acc_stderr": 0.008022527306282314
},
"harness|drop|3": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965902,
"f1": 0.08095008389261764,
"f1_stderr": 0.0017837058432559263
},
"harness|gsm8k|5": {
"acc": 0.009097801364670205,
"acc_stderr": 0.002615326510775673
},
"harness|winogrande|5": {
"acc": 0.6471981057616417,
"acc_stderr": 0.013429728101788956
}
}
```
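
To see which of the 64 per-task configurations are available before picking one, the `datasets` library can enumerate them. This is an illustrative sketch, not part of the original card:

```python
from datasets import get_dataset_config_names

# List every evaluation configuration in this repository
# (e.g. "harness_winogrande_5", "harness_gsm8k_5", "results").
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_Corianas__gpt-j-6B-Dolly"
)
print(len(configs), configs[:5])
```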
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Corianas__gpt-j-6B-Dolly
|
[
"region:us"
] |
2023-08-17T22:44:37+00:00
|
{"pretty_name": "Evaluation run of Corianas/gpt-j-6B-Dolly", "dataset_summary": "Dataset automatically created during the evaluation run of model [Corianas/gpt-j-6B-Dolly](https://huggingface.co/Corianas/gpt-j-6B-Dolly) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Corianas__gpt-j-6B-Dolly\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T01:23:52.389948](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__gpt-j-6B-Dolly/blob/main/results_2023-09-23T01-23-52.389948.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642965902,\n \"f1\": 0.08095008389261764,\n \"f1_stderr\": 0.0017837058432559263,\n \"acc\": 0.32814795356315596,\n \"acc_stderr\": 0.008022527306282314\n },\n \"harness|drop|3\": {\n \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642965902,\n \"f1\": 0.08095008389261764,\n \"f1_stderr\": 0.0017837058432559263\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \"acc_stderr\": 0.002615326510775673\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6471981057616417,\n \"acc_stderr\": 0.013429728101788956\n }\n}\n```", "repo_url": "https://huggingface.co/Corianas/gpt-j-6B-Dolly", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T01_23_52.389948", "path": ["**/details_harness|drop|3_2023-09-23T01-23-52.389948.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T01-23-52.389948.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T01_23_52.389948", "path": ["**/details_harness|gsm8k|5_2023-09-23T01-23-52.389948.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T01-23-52.389948.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:40:52.841362.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:40:52.841362.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:40:52.841362.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:40:52.841362.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:40:52.841362.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T01_23_52.389948", "path": ["**/details_harness|winogrande|5_2023-09-23T01-23-52.389948.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T01-23-52.389948.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_40_52.841362", "path": ["results_2023-07-19T15:40:52.841362.parquet"]}, {"split": "2023_09_23T01_23_52.389948", "path": ["results_2023-09-23T01-23-52.389948.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T01-23-52.389948.parquet"]}]}]}
|
2023-09-23T00:24:03+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Corianas/gpt-j-6B-Dolly
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Corianas/gpt-j-6B-Dolly on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-09-23T01:23:52.389948 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Corianas/gpt-j-6B-Dolly",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Corianas/gpt-j-6B-Dolly on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T01:23:52.389948(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Corianas/gpt-j-6B-Dolly",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Corianas/gpt-j-6B-Dolly on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T01:23:52.389948(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Corianas/gpt-j-6B-Dolly## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Corianas/gpt-j-6B-Dolly on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T01:23:52.389948(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
3ecdbaa4dea3324a2a7bbae6b294c8d373b8ea8a
|
# Dataset Card for Evaluation run of Corianas/Quokka_590m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Corianas/Quokka_590m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Corianas/Quokka_590m](https://huggingface.co/Corianas/Quokka_590m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Corianas__Quokka_590m",
"harness_winogrande_5",
split="train")
```
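The aggregated numbers mentioned above live in the separate "results" configuration; a minimal sketch of reading them (the "results" configuration name and its "latest" split are taken from this card's own configuration list):

```python
from datasets import load_dataset

# "results" aggregates the scores of every run of this model;
# the "latest" split points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Corianas__Quokka_590m",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics
```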
## Latest results
These are the [latest results from run 2023-10-14T16:50:32.705897](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__Quokka_590m/blob/main/results_2023-10-14T16-50-32.705897.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0036703020134228187,
"em_stderr": 0.0006192871806511148,
"f1": 0.03957634228187927,
"f1_stderr": 0.0012672315965293443,
"acc": 0.2509865824782952,
"acc_stderr": 0.0070261881296128145
},
"harness|drop|3": {
"em": 0.0036703020134228187,
"em_stderr": 0.0006192871806511148,
"f1": 0.03957634228187927,
"f1_stderr": 0.0012672315965293443
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5019731649565904,
"acc_stderr": 0.014052376259225629
}
}
```
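Read programmatically, the block above is just a nested mapping from benchmark name to metrics; a small illustrative snippet (the dictionary literal simply mirrors the values printed above, it is not recomputed):

```python
# Mirrors the "latest results" JSON shown above (values copied, not recomputed).
results = {
    "harness|winogrande|5": {"acc": 0.5019731649565904, "acc_stderr": 0.014052376259225629},
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
}

# Pull out a single metric, e.g. the 5-shot Winogrande accuracy.
winogrande = results["harness|winogrande|5"]
print(f"winogrande (5-shot): acc = {winogrande['acc']:.4f} +/- {winogrande['acc_stderr']:.4f}")
```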
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Corianas__Quokka_590m
|
[
"region:us"
] |
2023-08-17T22:44:45+00:00
|
{"pretty_name": "Evaluation run of Corianas/Quokka_590m", "dataset_summary": "Dataset automatically created during the evaluation run of model [Corianas/Quokka_590m](https://huggingface.co/Corianas/Quokka_590m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Corianas__Quokka_590m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-14T16:50:32.705897](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__Quokka_590m/blob/main/results_2023-10-14T16-50-32.705897.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0036703020134228187,\n \"em_stderr\": 0.0006192871806511148,\n \"f1\": 0.03957634228187927,\n \"f1_stderr\": 0.0012672315965293443,\n \"acc\": 0.2509865824782952,\n \"acc_stderr\": 0.0070261881296128145\n },\n \"harness|drop|3\": {\n \"em\": 0.0036703020134228187,\n \"em_stderr\": 0.0006192871806511148,\n \"f1\": 0.03957634228187927,\n \"f1_stderr\": 0.0012672315965293443\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5019731649565904,\n \"acc_stderr\": 0.014052376259225629\n }\n}\n```", "repo_url": "https://huggingface.co/Corianas/Quokka_590m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|arc:challenge|25_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_14T16_50_32.705897", "path": ["**/details_harness|drop|3_2023-10-14T16-50-32.705897.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-14T16-50-32.705897.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_14T16_50_32.705897", "path": ["**/details_harness|gsm8k|5_2023-10-14T16-50-32.705897.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-14T16-50-32.705897.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hellaswag|10_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:57:25.772408.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:57:25.772408.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T09:57:25.772408.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:57:25.772408.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T09:57:25.772408.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T09:57:25.772408.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_14T16_50_32.705897", "path": ["**/details_harness|winogrande|5_2023-10-14T16-50-32.705897.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-14T16-50-32.705897.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T09_57_25.772408", "path": ["results_2023-07-24T09:57:25.772408.parquet"]}, {"split": "2023_10_14T16_50_32.705897", "path": ["results_2023-10-14T16-50-32.705897.parquet"]}, {"split": "latest", "path": ["results_2023-10-14T16-50-32.705897.parquet"]}]}]}
|
2023-10-14T15:50:43+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Corianas/Quokka_590m
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Corianas/Quokka_590m on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-14T16:50:32.705897 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Corianas/Quokka_590m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Corianas/Quokka_590m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-14T16:50:32.705897(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Corianas/Quokka_590m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Corianas/Quokka_590m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-14T16:50:32.705897(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Corianas/Quokka_590m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Corianas/Quokka_590m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-14T16:50:32.705897(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
7ff3ebc0b01de19d28bae4f0c4b3d6053c894e86
|
# Dataset Card for Evaluation run of TigerResearch/tigerbot-7b-sft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TigerResearch/tigerbot-7b-sft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TigerResearch/tigerbot-7b-sft](https://huggingface.co/TigerResearch/tigerbot-7b-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TigerResearch__tigerbot-7b-sft",
"harness_winogrande_5",
split="train")
```
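Since the card describes 64 task configurations, it can help to enumerate them before picking one; a small sketch using the standard `datasets` helper (the listing behaviour is assumed from the library, not taken from this card):

```python
from datasets import get_dataset_config_names

# List every configuration exposed by this details repository.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_TigerResearch__tigerbot-7b-sft"
)
print(len(configs))   # expected: one configuration per evaluated task, plus "results"
print(configs[:5])    # peek at the first few configuration names
```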
## Latest results
These are the [latest results from run 2023-09-16T22:00:02.425460](https://huggingface.co/datasets/open-llm-leaderboard/details_TigerResearch__tigerbot-7b-sft/blob/main/results_2023-09-16T22-00-02.425460.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.23290687919463088,
"em_stderr": 0.0043286737393498816,
"f1": 0.26997588087248303,
"f1_stderr": 0.00434706090322385,
"acc": 0.3491427877305342,
"acc_stderr": 0.010108254601981293
},
"harness|drop|3": {
"em": 0.23290687919463088,
"em_stderr": 0.0043286737393498816,
"f1": 0.26997588087248303,
"f1_stderr": 0.00434706090322385
},
"harness|gsm8k|5": {
"acc": 0.06292645943896892,
"acc_stderr": 0.006688762581532747
},
"harness|winogrande|5": {
"acc": 0.6353591160220995,
"acc_stderr": 0.013527746622429839
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TigerResearch__tigerbot-7b-sft
|
[
"region:us"
] |
2023-08-17T22:44:54+00:00
|
{"pretty_name": "Evaluation run of TigerResearch/tigerbot-7b-sft", "dataset_summary": "Dataset automatically created during the evaluation run of model [TigerResearch/tigerbot-7b-sft](https://huggingface.co/TigerResearch/tigerbot-7b-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TigerResearch__tigerbot-7b-sft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-16T22:00:02.425460](https://huggingface.co/datasets/open-llm-leaderboard/details_TigerResearch__tigerbot-7b-sft/blob/main/results_2023-09-16T22-00-02.425460.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.23290687919463088,\n \"em_stderr\": 0.0043286737393498816,\n \"f1\": 0.26997588087248303,\n \"f1_stderr\": 0.00434706090322385,\n \"acc\": 0.3491427877305342,\n \"acc_stderr\": 0.010108254601981293\n },\n \"harness|drop|3\": {\n \"em\": 0.23290687919463088,\n \"em_stderr\": 0.0043286737393498816,\n \"f1\": 0.26997588087248303,\n \"f1_stderr\": 0.00434706090322385\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06292645943896892,\n \"acc_stderr\": 0.006688762581532747\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6353591160220995,\n \"acc_stderr\": 0.013527746622429839\n }\n}\n```", "repo_url": "https://huggingface.co/TigerResearch/tigerbot-7b-sft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|arc:challenge|25_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T22_00_02.425460", "path": ["**/details_harness|drop|3_2023-09-16T22-00-02.425460.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-16T22-00-02.425460.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T22_00_02.425460", "path": ["**/details_harness|gsm8k|5_2023-09-16T22-00-02.425460.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-16T22-00-02.425460.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hellaswag|10_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T10:11:16.133446.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T10:11:16.133446.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T10:11:16.133446.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T10:11:16.133446.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T10:11:16.133446.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T22_00_02.425460", "path": ["**/details_harness|winogrande|5_2023-09-16T22-00-02.425460.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-16T22-00-02.425460.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T10_11_16.133446", "path": ["results_2023-08-17T10:11:16.133446.parquet"]}, {"split": "2023_09_16T22_00_02.425460", "path": ["results_2023-09-16T22-00-02.425460.parquet"]}, {"split": "latest", "path": ["results_2023-09-16T22-00-02.425460.parquet"]}]}]}
|
2023-09-16T21:00:13+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TigerResearch/tigerbot-7b-sft
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TigerResearch/tigerbot-7b-sft on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
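```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TigerResearch__tigerbot-7b-sft",
	"harness_winogrande_5",
	split="train")
```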
## Latest results
These are the latest results from run 2023-09-16T22:00:02.425460 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TigerResearch/tigerbot-7b-sft",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TigerResearch/tigerbot-7b-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-16T22:00:02.425460(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TigerResearch/tigerbot-7b-sft",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TigerResearch/tigerbot-7b-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-16T22:00:02.425460(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TigerResearch/tigerbot-7b-sft## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TigerResearch/tigerbot-7b-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-16T22:00:02.425460(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
3654b5f67f3dfd3d35d7c4a28bb125d6bba93a06
|
# Dataset Card for Evaluation run of Neko-Institute-of-Science/metharme-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Neko-Institute-of-Science/metharme-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Neko-Institute-of-Science/metharme-7b](https://huggingface.co/Neko-Institute-of-Science/metharme-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Neko-Institute-of-Science__metharme-7b",
"harness_winogrande_5",
split="train")
```
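The aggregated scores can be pulled the same way through the "results" configuration mentioned above; the snippet below is a sketch that assumes the "results" configuration exposes a "latest" split alias alongside the timestamped splits, as the per-task configurations do in this dataset's metadata:

```python
from datasets import load_dataset

# Aggregated metrics of the most recent evaluation run (sketch; assumes the
# "results" configuration has a "latest" split like the task configurations)
results = load_dataset("open-llm-leaderboard/details_Neko-Institute-of-Science__metharme-7b",
	"results",
	split="latest")
print(results[0])
```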
## Latest results
These are the [latest results from run 2023-09-16T20:24:28.666298](https://huggingface.co/datasets/open-llm-leaderboard/details_Neko-Institute-of-Science__metharme-7b/blob/main/results_2023-09-16T20-24-28.666298.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801231,
"f1": 0.05624056208053698,
"f1_stderr": 0.0012929182704935764,
"acc": 0.38768667277415325,
"acc_stderr": 0.009274979179847457
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801231,
"f1": 0.05624056208053698,
"f1_stderr": 0.0012929182704935764
},
"harness|gsm8k|5": {
"acc": 0.050037907505686124,
"acc_stderr": 0.006005442354577731
},
"harness|winogrande|5": {
"acc": 0.7253354380426204,
"acc_stderr": 0.012544516005117183
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Neko-Institute-of-Science__metharme-7b
|
[
"region:us"
] |
2023-08-17T22:45:04+00:00
|
{"pretty_name": "Evaluation run of Neko-Institute-of-Science/metharme-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Neko-Institute-of-Science/metharme-7b](https://huggingface.co/Neko-Institute-of-Science/metharme-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Neko-Institute-of-Science__metharme-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-16T20:24:28.666298](https://huggingface.co/datasets/open-llm-leaderboard/details_Neko-Institute-of-Science__metharme-7b/blob/main/results_2023-09-16T20-24-28.666298.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801231,\n \"f1\": 0.05624056208053698,\n \"f1_stderr\": 0.0012929182704935764,\n \"acc\": 0.38768667277415325,\n \"acc_stderr\": 0.009274979179847457\n },\n \"harness|drop|3\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801231,\n \"f1\": 0.05624056208053698,\n \"f1_stderr\": 0.0012929182704935764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.050037907505686124,\n \"acc_stderr\": 0.006005442354577731\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7253354380426204,\n \"acc_stderr\": 0.012544516005117183\n }\n}\n```", "repo_url": "https://huggingface.co/Neko-Institute-of-Science/metharme-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|arc:challenge|25_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T20_24_28.666298", "path": ["**/details_harness|drop|3_2023-09-16T20-24-28.666298.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-16T20-24-28.666298.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T20_24_28.666298", "path": ["**/details_harness|gsm8k|5_2023-09-16T20-24-28.666298.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-16T20-24-28.666298.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hellaswag|10_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:25:19.200285.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:25:19.200285.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:25:19.200285.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T12:25:19.200285.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T12:25:19.200285.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T12:25:19.200285.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T20_24_28.666298", "path": ["**/details_harness|winogrande|5_2023-09-16T20-24-28.666298.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-16T20-24-28.666298.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T12_25_19.200285", "path": ["results_2023-07-18T12:25:19.200285.parquet"]}, {"split": "2023_09_16T20_24_28.666298", "path": ["results_2023-09-16T20-24-28.666298.parquet"]}, {"split": "latest", "path": ["results_2023-09-16T20-24-28.666298.parquet"]}]}]}
|
2023-09-16T19:24:40+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Neko-Institute-of-Science/metharme-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Neko-Institute-of-Science/metharme-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
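For reference, a minimal sketch of the load call this summary alludes to (the stripped card omits the original snippet); the repository id and config name below are inferred from this card's metadata and should be treated as assumptions:

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's details-dataset naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_Neko-Institute-of-Science__metharme-7b",
    "harness_winogrande_5",
    split="train",
)
```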
## Latest results
These are the latest results from run 2023-09-16T20:24:28.666298 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Neko-Institute-of-Science/metharme-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Neko-Institute-of-Science/metharme-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-16T20:24:28.666298(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Neko-Institute-of-Science/metharme-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Neko-Institute-of-Science/metharme-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-16T20:24:28.666298(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Neko-Institute-of-Science/metharme-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Neko-Institute-of-Science/metharme-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-16T20:24:28.666298(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
3ed73866118feaf71fdf091d9bc805261ad7fa5e
|
# Dataset Card for Evaluation run of Neko-Institute-of-Science/pygmalion-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Neko-Institute-of-Science/pygmalion-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Neko-Institute-of-Science/pygmalion-7b](https://huggingface.co/Neko-Institute-of-Science/pygmalion-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Neko-Institute-of-Science__pygmalion-7b",
"harness_winogrande_5",
split="train")
```
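For illustration, a minimal sketch (not part of the original card) of loading the aggregated "results" configuration mentioned in the summary above; the config and split names are taken from this card's config list, so treat them as assumptions if the repository layout changes:

```python
from datasets import load_dataset

# Sketch: load the aggregated "results" configuration instead of a single task.
# The "latest" split name follows the config list declared in this card's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_Neko-Institute-of-Science__pygmalion-7b",
    "results",
    split="latest",
)

# Each row holds the aggregated metrics of one evaluation run.
print(results[0])
```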
## Latest results
These are the [latest results from run 2023-09-22T20:45:41.888775](https://huggingface.co/datasets/open-llm-leaderboard/details_Neko-Institute-of-Science__pygmalion-7b/blob/main/results_2023-09-22T20-45-41.888775.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268467,
"f1": 0.057855494966443086,
"f1_stderr": 0.0013312169448543882,
"acc": 0.3842127655245746,
"acc_stderr": 0.009186954923281733
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268467,
"f1": 0.057855494966443086,
"f1_stderr": 0.0013312169448543882
},
"harness|gsm8k|5": {
"acc": 0.04624715693707354,
"acc_stderr": 0.0057849916626918655
},
"harness|winogrande|5": {
"acc": 0.7221783741120757,
"acc_stderr": 0.012588918183871601
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Neko-Institute-of-Science__pygmalion-7b
|
[
"region:us"
] |
2023-08-17T22:45:12+00:00
|
{"pretty_name": "Evaluation run of Neko-Institute-of-Science/pygmalion-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Neko-Institute-of-Science/pygmalion-7b](https://huggingface.co/Neko-Institute-of-Science/pygmalion-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Neko-Institute-of-Science__pygmalion-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T20:45:41.888775](https://huggingface.co/datasets/open-llm-leaderboard/details_Neko-Institute-of-Science__pygmalion-7b/blob/main/results_2023-09-22T20-45-41.888775.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788268467,\n \"f1\": 0.057855494966443086,\n \"f1_stderr\": 0.0013312169448543882,\n \"acc\": 0.3842127655245746,\n \"acc_stderr\": 0.009186954923281733\n },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788268467,\n \"f1\": 0.057855494966443086,\n \"f1_stderr\": 0.0013312169448543882\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04624715693707354,\n \"acc_stderr\": 0.0057849916626918655\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.012588918183871601\n }\n}\n```", "repo_url": "https://huggingface.co/Neko-Institute-of-Science/pygmalion-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T20_45_41.888775", "path": ["**/details_harness|drop|3_2023-09-22T20-45-41.888775.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T20-45-41.888775.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T20_45_41.888775", "path": ["**/details_harness|gsm8k|5_2023-09-22T20-45-41.888775.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T20-45-41.888775.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:16:07.141450.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:16:07.141450.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:16:07.141450.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:16:07.141450.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:16:07.141450.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:16:07.141450.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T20_45_41.888775", "path": ["**/details_harness|winogrande|5_2023-09-22T20-45-41.888775.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T20-45-41.888775.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_16_07.141450", "path": ["results_2023-07-19T16:16:07.141450.parquet"]}, {"split": "2023_09_22T20_45_41.888775", "path": ["results_2023-09-22T20-45-41.888775.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T20-45-41.888775.parquet"]}]}]}
|
2023-09-22T19:45:54+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Neko-Institute-of-Science/pygmalion-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Neko-Institute-of-Science/pygmalion-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
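The original card's loading snippet is not reproduced in this processed copy. A minimal sketch of what such a call looks like; the repository id below is an assumption inferred from the leaderboard's usual `details_<org>__<model>` naming pattern, and "harness_winogrande_5" is one of the configurations listed in this repository's metadata:
```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_Neko-Institute-of-Science__pygmalion-7b",
    "harness_winogrande_5",   # one of the 64 task configurations
    split="train",            # "train" always points to the latest results
)
```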
## Latest results
These are the latest results from run 2023-09-22T20:45:41.888775 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Neko-Institute-of-Science/pygmalion-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Neko-Institute-of-Science/pygmalion-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T20:45:41.888775(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Neko-Institute-of-Science/pygmalion-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Neko-Institute-of-Science/pygmalion-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T20:45:41.888775(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Neko-Institute-of-Science/pygmalion-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Neko-Institute-of-Science/pygmalion-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T20:45:41.888775(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b04db1903f8bb9c0504441bce576945cf1287573
|
# Dataset Card for Evaluation run of lvkaokao/llama2-7b-hf-instruction-lora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lvkaokao/llama2-7b-hf-instruction-lora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [lvkaokao/llama2-7b-hf-instruction-lora](https://huggingface.co/lvkaokao/llama2-7b-hf-instruction-lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-instruction-lora",
"harness_winogrande_5",
split="train")
```
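The same call works for any of the other configurations. A sketch, using configuration and split names that appear in this repository's configuration list, for loading the aggregated "results" configuration and for pinning a single task to a specific run:

```python
from datasets import load_dataset

# Aggregated metrics across all tasks; "latest" tracks the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-instruction-lora",
    "results",
    split="latest",
)

# A single task pinned to one run, using that run's timestamped split name.
gsm8k_run = load_dataset(
    "open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-instruction-lora",
    "harness_gsm8k_5",
    split="2023_09_22T19_01_17.427442",
)
```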
## Latest results
These are the [latest results from run 2023-09-22T19:01:17.427442](https://huggingface.co/datasets/open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-instruction-lora/blob/main/results_2023-09-22T19-01-17.427442.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.32466442953020136,
"em_stderr": 0.004795311441664345,
"f1": 0.372114093959732,
"f1_stderr": 0.00471907408435466,
"acc": 0.4202347692309533,
"acc_stderr": 0.010254299592459359
},
"harness|drop|3": {
"em": 0.32466442953020136,
"em_stderr": 0.004795311441664345,
"f1": 0.372114093959732,
"f1_stderr": 0.00471907408435466
},
"harness|gsm8k|5": {
"acc": 0.09855951478392722,
"acc_stderr": 0.008210320350946328
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.01229827883397239
}
}
```
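The same figures can also be read programmatically. A sketch that downloads the results file linked above (the filename is taken from that link; the exact JSON layout is not reproduced here, so the snippet only inspects it):

```python
import json
from huggingface_hub import hf_hub_download

# Fetch the raw results JSON for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-instruction-lora",
    filename="results_2023-09-22T19-01-17.427442.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Print the top of the document to inspect its structure.
print(json.dumps(results, indent=2)[:800])
```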
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-instruction-lora
|
[
"region:us"
] |
2023-08-17T22:45:20+00:00
|
{"pretty_name": "Evaluation run of lvkaokao/llama2-7b-hf-instruction-lora", "dataset_summary": "Dataset automatically created during the evaluation run of model [lvkaokao/llama2-7b-hf-instruction-lora](https://huggingface.co/lvkaokao/llama2-7b-hf-instruction-lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-instruction-lora\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T19:01:17.427442](https://huggingface.co/datasets/open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-instruction-lora/blob/main/results_2023-09-22T19-01-17.427442.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.32466442953020136,\n \"em_stderr\": 0.004795311441664345,\n \"f1\": 0.372114093959732,\n \"f1_stderr\": 0.00471907408435466,\n \"acc\": 0.4202347692309533,\n \"acc_stderr\": 0.010254299592459359\n },\n \"harness|drop|3\": {\n \"em\": 0.32466442953020136,\n \"em_stderr\": 0.004795311441664345,\n \"f1\": 0.372114093959732,\n \"f1_stderr\": 0.00471907408435466\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09855951478392722,\n \"acc_stderr\": 0.008210320350946328\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.01229827883397239\n }\n}\n```", "repo_url": "https://huggingface.co/lvkaokao/llama2-7b-hf-instruction-lora", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|arc:challenge|25_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T19_01_17.427442", "path": ["**/details_harness|drop|3_2023-09-22T19-01-17.427442.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T19-01-17.427442.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T19_01_17.427442", "path": ["**/details_harness|gsm8k|5_2023-09-22T19-01-17.427442.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T19-01-17.427442.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hellaswag|10_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:42:44.392764.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:42:44.392764.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:42:44.392764.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T14:42:44.392764.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T14:42:44.392764.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T14:42:44.392764.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T19_01_17.427442", "path": ["**/details_harness|winogrande|5_2023-09-22T19-01-17.427442.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T19-01-17.427442.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T14_42_44.392764", "path": ["results_2023-08-09T14:42:44.392764.parquet"]}, {"split": "2023_09_22T19_01_17.427442", "path": ["results_2023-09-22T19-01-17.427442.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T19-01-17.427442.parquet"]}]}]}
|
2023-09-22T18:01:31+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of lvkaokao/llama2-7b-hf-instruction-lora
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model lvkaokao/llama2-7b-hf-instruction-lora on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
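The loading snippet from the original card is not reproduced in this processed copy; a minimal sketch of the call it refers to, using the repository id and configuration name given in the card's metadata:
```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-instruction-lora",
    "harness_winogrande_5",
    split="train",  # "train" always points to the latest results
)
```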
## Latest results
These are the latest results from run 2023-09-22T19:01:17.427442 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of lvkaokao/llama2-7b-hf-instruction-lora",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model lvkaokao/llama2-7b-hf-instruction-lora on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T19:01:17.427442(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of lvkaokao/llama2-7b-hf-instruction-lora",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model lvkaokao/llama2-7b-hf-instruction-lora on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T19:01:17.427442(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
27,
31,
175,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of lvkaokao/llama2-7b-hf-instruction-lora## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model lvkaokao/llama2-7b-hf-instruction-lora on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T19:01:17.427442(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
9b6d10b4959d72b48e210788a4c2d5858d1c7a8c
|
# Dataset Card for Evaluation run of Tap-M/Luna-AI-Llama2-Uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Tap-M/Luna-AI-Llama2-Uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Tap-M/Luna-AI-Llama2-Uncensored](https://huggingface.co/Tap-M/Luna-AI-Llama2-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Tap-M__Luna-AI-Llama2-Uncensored",
"harness_winogrande_5",
split="train")
```
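The repository exposes one configuration per task. If you want to enumerate them before loading, a minimal sketch (assuming the standard `datasets` client and network access to the Hub):
```python
from datasets import get_dataset_config_names

# List the task configurations available in this details repository
configs = get_dataset_config_names("open-llm-leaderboard/details_Tap-M__Luna-AI-Llama2-Uncensored")
print(len(configs), configs[:5])
```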
## Latest results
These are the [latest results from run 2023-10-13T11:27:26.030734](https://huggingface.co/datasets/open-llm-leaderboard/details_Tap-M__Luna-AI-Llama2-Uncensored/blob/main/results_2023-10-13T11-27-26.030734.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.19651845637583892,
"em_stderr": 0.004069389648475156,
"f1": 0.26367764261744986,
"f1_stderr": 0.0041324301979436965,
"acc": 0.413131375387228,
"acc_stderr": 0.010360509171200127
},
"harness|drop|3": {
"em": 0.19651845637583892,
"em_stderr": 0.004069389648475156,
"f1": 0.26367764261744986,
"f1_stderr": 0.0041324301979436965
},
"harness|gsm8k|5": {
"acc": 0.09855951478392722,
"acc_stderr": 0.008210320350946319
},
"harness|winogrande|5": {
"acc": 0.7277032359905288,
"acc_stderr": 0.012510697991453937
}
}
```
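To work with these aggregated numbers programmatically rather than copying them from this card, you can load the "results" configuration; a minimal sketch (the "latest" split points at the most recent run, as described above):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_Tap-M__Luna-AI-Llama2-Uncensored",
                       "results",
                       split="latest")
print(results[0])  # inspect the stored aggregate record
```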
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Tap-M__Luna-AI-Llama2-Uncensored
|
[
"region:us"
] |
2023-08-17T22:45:29+00:00
|
{"pretty_name": "Evaluation run of Tap-M/Luna-AI-Llama2-Uncensored", "dataset_summary": "Dataset automatically created during the evaluation run of model [Tap-M/Luna-AI-Llama2-Uncensored](https://huggingface.co/Tap-M/Luna-AI-Llama2-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Tap-M__Luna-AI-Llama2-Uncensored\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T11:27:26.030734](https://huggingface.co/datasets/open-llm-leaderboard/details_Tap-M__Luna-AI-Llama2-Uncensored/blob/main/results_2023-10-13T11-27-26.030734.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19651845637583892,\n \"em_stderr\": 0.004069389648475156,\n \"f1\": 0.26367764261744986,\n \"f1_stderr\": 0.0041324301979436965,\n \"acc\": 0.413131375387228,\n \"acc_stderr\": 0.010360509171200127\n },\n \"harness|drop|3\": {\n \"em\": 0.19651845637583892,\n \"em_stderr\": 0.004069389648475156,\n \"f1\": 0.26367764261744986,\n \"f1_stderr\": 0.0041324301979436965\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09855951478392722,\n \"acc_stderr\": 0.008210320350946319\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453937\n }\n}\n```", "repo_url": "https://huggingface.co/Tap-M/Luna-AI-Llama2-Uncensored", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|arc:challenge|25_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_12T23_46_48.107460", "path": ["**/details_harness|drop|3_2023-10-12T23-46-48.107460.parquet"]}, {"split": "2023_10_13T11_27_26.030734", "path": ["**/details_harness|drop|3_2023-10-13T11-27-26.030734.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T11-27-26.030734.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_12T23_46_48.107460", "path": ["**/details_harness|gsm8k|5_2023-10-12T23-46-48.107460.parquet"]}, {"split": "2023_10_13T11_27_26.030734", "path": ["**/details_harness|gsm8k|5_2023-10-13T11-27-26.030734.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T11-27-26.030734.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": 
"2023_07_24T15_10_16.061050", "path": ["**/details_harness|hellaswag|10_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:10:16.061050.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:10:16.061050.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:10:16.061050.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T15:10:16.061050.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": 
[{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T15:10:16.061050.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T15:10:16.061050.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_12T23_46_48.107460", "path": ["**/details_harness|winogrande|5_2023-10-12T23-46-48.107460.parquet"]}, {"split": "2023_10_13T11_27_26.030734", "path": ["**/details_harness|winogrande|5_2023-10-13T11-27-26.030734.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T11-27-26.030734.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T15_10_16.061050", "path": ["results_2023-07-24T15:10:16.061050.parquet"]}, {"split": "2023_10_12T23_46_48.107460", "path": ["results_2023-10-12T23-46-48.107460.parquet"]}, {"split": "2023_10_13T11_27_26.030734", "path": ["results_2023-10-13T11-27-26.030734.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T11-27-26.030734.parquet"]}]}]}
|
2023-10-13T10:27:35+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Tap-M/Luna-AI-Llama2-Uncensored
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Tap-M/Luna-AI-Llama2-Uncensored on the Open LLM Leaderboard.
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-13T11:27:26.030734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Tap-M/Luna-AI-Llama2-Uncensored",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Tap-M/Luna-AI-Llama2-Uncensored on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-13T11:27:26.030734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Tap-M/Luna-AI-Llama2-Uncensored",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Tap-M/Luna-AI-Llama2-Uncensored on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-13T11:27:26.030734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Tap-M/Luna-AI-Llama2-Uncensored## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Tap-M/Luna-AI-Llama2-Uncensored on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T11:27:26.030734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
41c3b39e264732679190a164f423f3fcecca2acc
|
# Dataset Card for Evaluation run of jxhong/CAlign-alpaca-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jxhong/CAlign-alpaca-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jxhong/CAlign-alpaca-7b](https://huggingface.co/jxhong/CAlign-alpaca-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jxhong__CAlign-alpaca-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-23T14:18:50.060462](https://huggingface.co/datasets/open-llm-leaderboard/details_jxhong__CAlign-alpaca-7b/blob/main/results_2023-09-23T14-18-50.060462.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.1967281879194631,
"em_stderr": 0.0040710291374288195,
"f1": 0.2515457214765097,
"f1_stderr": 0.004085507734234057,
"acc": 0.36712327209690443,
"acc_stderr": 0.007903286807442752
},
"harness|drop|3": {
"em": 0.1967281879194631,
"em_stderr": 0.0040710291374288195,
"f1": 0.2515457214765097,
"f1_stderr": 0.004085507734234057
},
"harness|gsm8k|5": {
"acc": 0.013646702047005308,
"acc_stderr": 0.003195747075480819
},
"harness|winogrande|5": {
"acc": 0.7205998421468035,
"acc_stderr": 0.012610826539404686
}
}
```
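Beyond the aggregate numbers above, each task configuration stores the detailed outputs of the run; a small sketch for inspecting them (assuming network access to the Hub):
```python
from datasets import load_dataset

# Detailed records for the most recent winogrande run
details = load_dataset("open-llm-leaderboard/details_jxhong__CAlign-alpaca-7b",
                       "harness_winogrande_5",
                       split="latest")
print(len(details))          # number of evaluated examples
print(details.column_names)  # available per-example fields
```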
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jxhong__CAlign-alpaca-7b
|
[
"region:us"
] |
2023-08-17T22:45:38+00:00
|
{"pretty_name": "Evaluation run of jxhong/CAlign-alpaca-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [jxhong/CAlign-alpaca-7b](https://huggingface.co/jxhong/CAlign-alpaca-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jxhong__CAlign-alpaca-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T14:18:50.060462](https://huggingface.co/datasets/open-llm-leaderboard/details_jxhong__CAlign-alpaca-7b/blob/main/results_2023-09-23T14-18-50.060462.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1967281879194631,\n \"em_stderr\": 0.0040710291374288195,\n \"f1\": 0.2515457214765097,\n \"f1_stderr\": 0.004085507734234057,\n \"acc\": 0.36712327209690443,\n \"acc_stderr\": 0.007903286807442752\n },\n \"harness|drop|3\": {\n \"em\": 0.1967281879194631,\n \"em_stderr\": 0.0040710291374288195,\n \"f1\": 0.2515457214765097,\n \"f1_stderr\": 0.004085507734234057\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.013646702047005308,\n \"acc_stderr\": 0.003195747075480819\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7205998421468035,\n \"acc_stderr\": 0.012610826539404686\n }\n}\n```", "repo_url": "https://huggingface.co/jxhong/CAlign-alpaca-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|arc:challenge|25_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T14_18_50.060462", "path": ["**/details_harness|drop|3_2023-09-23T14-18-50.060462.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T14-18-50.060462.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T14_18_50.060462", "path": ["**/details_harness|gsm8k|5_2023-09-23T14-18-50.060462.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T14-18-50.060462.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hellaswag|10_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:26:06.755216.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:26:06.755216.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T20:26:06.755216.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T20:26:06.755216.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T20:26:06.755216.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T14_18_50.060462", "path": ["**/details_harness|winogrande|5_2023-09-23T14-18-50.060462.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T14-18-50.060462.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T20_26_06.755216", "path": ["results_2023-08-09T20:26:06.755216.parquet"]}, {"split": "2023_09_23T14_18_50.060462", "path": ["results_2023-09-23T14-18-50.060462.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T14-18-50.060462.parquet"]}]}]}
|
2023-09-23T13:19:02+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jxhong/CAlign-alpaca-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jxhong/CAlign-alpaca-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
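The original snippet is not preserved in this rendering; as a minimal sketch, assuming this run follows the same `open-llm-leaderboard/details_<org>__<model>` repository naming used by the other evaluation datasets in this dump and the `harness_winogrande_5` configuration listed in this record's metadata:

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the naming convention of the other
# evaluation-details datasets; the config name comes from this record's metadata.
data = load_dataset("open-llm-leaderboard/details_jxhong__CAlign-alpaca-7b",
                    "harness_winogrande_5",
                    split="train")
```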
## Latest results
These are the latest results from run 2023-09-23T14:18:50.060462 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jxhong/CAlign-alpaca-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jxhong/CAlign-alpaca-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T14:18:50.060462(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jxhong/CAlign-alpaca-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jxhong/CAlign-alpaca-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T14:18:50.060462(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jxhong/CAlign-alpaca-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jxhong/CAlign-alpaca-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T14:18:50.060462(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
41c9f908b8232edd6062843b11586296206ca15c
|
# Dataset Card for Evaluation run of EleutherAI/pythia-1b-deduped
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-1b-deduped
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-1b-deduped](https://huggingface.co/EleutherAI/pythia-1b-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-1b-deduped",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-23T07:59:54.225479](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-1b-deduped/blob/main/results_2023-09-23T07-59-54.225479.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826894,
"f1": 0.047140310402684724,
"f1_stderr": 0.001227508776398318,
"acc": 0.27364192695789125,
"acc_stderr": 0.008468429816373534
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826894,
"f1": 0.047140310402684724,
"f1_stderr": 0.001227508776398318
},
"harness|gsm8k|5": {
"acc": 0.011372251705837756,
"acc_stderr": 0.002920666198788757
},
"harness|winogrande|5": {
"acc": 0.5359116022099447,
"acc_stderr": 0.014016193433958312
}
}
```
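These aggregated figures can also be loaded programmatically. A minimal sketch, assuming this repository exposes the same "results" configuration and "latest" split as the other evaluation datasets in this dump:

```python
from datasets import load_dataset

# "results" is assumed to hold the aggregated metrics of each run;
# the "latest" split is assumed to point at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-1b-deduped",
                       "results",
                       split="latest")
print(results[0])
```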
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__pythia-1b-deduped
|
[
"region:us"
] |
2023-08-17T22:45:47+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/pythia-1b-deduped", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/pythia-1b-deduped](https://huggingface.co/EleutherAI/pythia-1b-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-1b-deduped\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T07:59:54.225479](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-1b-deduped/blob/main/results_2023-09-23T07-59-54.225479.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826894,\n \"f1\": 0.047140310402684724,\n \"f1_stderr\": 0.001227508776398318,\n \"acc\": 0.27364192695789125,\n \"acc_stderr\": 0.008468429816373534\n },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826894,\n \"f1\": 0.047140310402684724,\n \"f1_stderr\": 0.001227508776398318\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.011372251705837756,\n \"acc_stderr\": 0.002920666198788757\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5359116022099447,\n \"acc_stderr\": 0.014016193433958312\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/pythia-1b-deduped", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T07_59_54.225479", "path": ["**/details_harness|drop|3_2023-09-23T07-59-54.225479.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T07-59-54.225479.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T07_59_54.225479", "path": ["**/details_harness|gsm8k|5_2023-09-23T07-59-54.225479.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T07-59-54.225479.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:26:17.449047.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:26:17.449047.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:26:17.449047.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:26:17.449047.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:26:17.449047.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T07_59_54.225479", "path": ["**/details_harness|winogrande|5_2023-09-23T07-59-54.225479.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T07-59-54.225479.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_26_17.449047", "path": ["results_2023-07-19T14:26:17.449047.parquet"]}, {"split": "2023_09_23T07_59_54.225479", "path": ["results_2023-09-23T07-59-54.225479.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T07-59-54.225479.parquet"]}]}]}
|
2023-09-23T07:00:05+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/pythia-1b-deduped
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/pythia-1b-deduped on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
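A minimal sketch of such a load call, following the naming pattern used by the other leaderboard detail datasets in this document (the repository id `open-llm-leaderboard/details_EleutherAI__pythia-1b-deduped` is inferred from that convention rather than stated in this stripped card; the configuration name is taken from the metadata above):

```python
from datasets import load_dataset

# Repository id inferred from the leaderboard naming convention (assumption);
# "harness_winogrande_5" is one of the per-task configurations listed in the metadata.
data = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__pythia-1b-deduped",
    "harness_winogrande_5",
    split="train",
)
```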
## Latest results
These are the latest results from run 2023-09-23T07:59:54.225479 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/pythia-1b-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-1b-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T07:59:54.225479(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/pythia-1b-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-1b-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T07:59:54.225479(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/pythia-1b-deduped## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-1b-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T07:59:54.225479(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
82d7f82af6c4fadabde8a9b75e780d2af8179a9b
|
# Dataset Card for Evaluation run of EleutherAI/gpt-neo-1.3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/gpt-neo-1.3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/gpt-neo-1.3B](https://huggingface.co/EleutherAI/gpt-neo-1.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__gpt-neo-1.3B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-19T06:57:24.325149](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-neo-1.3B/blob/main/results_2023-10-19T06-57-24.325149.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0018875838926174498,
"em_stderr": 0.00044451099905591613,
"f1": 0.04604656040268471,
"f1_stderr": 0.0011927407325477777,
"acc": 0.28680483708149906,
"acc_stderr": 0.007885675833669791
},
"harness|drop|3": {
"em": 0.0018875838926174498,
"em_stderr": 0.00044451099905591613,
"f1": 0.04604656040268471,
"f1_stderr": 0.0011927407325477777
},
"harness|gsm8k|5": {
"acc": 0.004548900682335102,
"acc_stderr": 0.0018535550440036204
},
"harness|winogrande|5": {
"acc": 0.569060773480663,
"acc_stderr": 0.01391779662333596
}
}
```
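As a complement to the per-task example above, a short sketch of how one might pull the aggregated scores from the "results" configuration described in the summary (the configuration and split names are taken from this card's metadata; the exact layout of the returned record is an assumption):

```python
from datasets import load_dataset

# "results" aggregates all task scores for this model; the "latest" split points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__gpt-neo-1.3B",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated metrics record
```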
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__gpt-neo-1.3B
|
[
"region:us"
] |
2023-08-17T22:45:56+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/gpt-neo-1.3B", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/gpt-neo-1.3B](https://huggingface.co/EleutherAI/gpt-neo-1.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__gpt-neo-1.3B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T06:57:24.325149](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-neo-1.3B/blob/main/results_2023-10-19T06-57-24.325149.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.00044451099905591613,\n \"f1\": 0.04604656040268471,\n \"f1_stderr\": 0.0011927407325477777,\n \"acc\": 0.28680483708149906,\n \"acc_stderr\": 0.007885675833669791\n },\n \"harness|drop|3\": {\n \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.00044451099905591613,\n \"f1\": 0.04604656040268471,\n \"f1_stderr\": 0.0011927407325477777\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \"acc_stderr\": 0.0018535550440036204\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.569060773480663,\n \"acc_stderr\": 0.01391779662333596\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/gpt-neo-1.3B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T06_57_24.325149", "path": ["**/details_harness|drop|3_2023-10-19T06-57-24.325149.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T06-57-24.325149.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T06_57_24.325149", "path": ["**/details_harness|gsm8k|5_2023-10-19T06-57-24.325149.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-19T06-57-24.325149.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:04:26.148804.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:04:26.148804.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:04:26.148804.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:04:26.148804.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:04:26.148804.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T06_57_24.325149", "path": ["**/details_harness|winogrande|5_2023-10-19T06-57-24.325149.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T06-57-24.325149.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_04_26.148804", "path": ["results_2023-07-19T15:04:26.148804.parquet"]}, {"split": "2023_10_19T06_57_24.325149", "path": ["results_2023-10-19T06-57-24.325149.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T06-57-24.325149.parquet"]}]}]}
|
2023-10-19T05:57:36+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/gpt-neo-1.3B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/gpt-neo-1.3B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-19T06:57:24.325149 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/gpt-neo-1.3B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/gpt-neo-1.3B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T06:57:24.325149(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/gpt-neo-1.3B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/gpt-neo-1.3B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T06:57:24.325149(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/gpt-neo-1.3B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/gpt-neo-1.3B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-19T06:57:24.325149(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
18512209ae9e70d27e0fc87c136273ef078e87fb
|
# Dataset Card for Evaluation run of EleutherAI/pythia-1.4b-deduped
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-1.4b-deduped
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-1.4b-deduped](https://huggingface.co/EleutherAI/pythia-1.4b-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-1.4b-deduped",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-16T20:03:21.000306](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-1.4b-deduped/blob/main/results_2023-10-16T20-03-21.000306.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298455,
"f1": 0.04330536912751699,
"f1_stderr": 0.0011661836886516016,
"acc": 0.29067337732239573,
"acc_stderr": 0.008203410149717792
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298455,
"f1": 0.04330536912751699,
"f1_stderr": 0.0011661836886516016
},
"harness|gsm8k|5": {
"acc": 0.008339651250947688,
"acc_stderr": 0.002504942226860525
},
"harness|winogrande|5": {
"acc": 0.5730071033938438,
"acc_stderr": 0.013901878072575058
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__pythia-1.4b-deduped
|
[
"region:us"
] |
2023-08-17T22:46:04+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/pythia-1.4b-deduped", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/pythia-1.4b-deduped](https://huggingface.co/EleutherAI/pythia-1.4b-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-1.4b-deduped\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T20:03:21.000306](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-1.4b-deduped/blob/main/results_2023-10-16T20-03-21.000306.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298455,\n \"f1\": 0.04330536912751699,\n \"f1_stderr\": 0.0011661836886516016,\n \"acc\": 0.29067337732239573,\n \"acc_stderr\": 0.008203410149717792\n },\n \"harness|drop|3\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298455,\n \"f1\": 0.04330536912751699,\n \"f1_stderr\": 0.0011661836886516016\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.008339651250947688,\n \"acc_stderr\": 0.002504942226860525\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5730071033938438,\n \"acc_stderr\": 0.013901878072575058\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/pythia-1.4b-deduped", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T20_03_21.000306", "path": ["**/details_harness|drop|3_2023-10-16T20-03-21.000306.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T20-03-21.000306.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T20_03_21.000306", "path": ["**/details_harness|gsm8k|5_2023-10-16T20-03-21.000306.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T20-03-21.000306.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:11:31.913251.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:11:31.913251.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:11:31.913251.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:11:31.913251.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:11:31.913251.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T20_03_21.000306", "path": ["**/details_harness|winogrande|5_2023-10-16T20-03-21.000306.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T20-03-21.000306.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_11_31.913251", "path": ["results_2023-07-19T15:11:31.913251.parquet"]}, {"split": "2023_10_16T20_03_21.000306", "path": ["results_2023-10-16T20-03-21.000306.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T20-03-21.000306.parquet"]}]}]}
|
2023-10-16T19:03:33+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/pythia-1.4b-deduped
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/pythia-1.4b-deduped on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
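For example (this mirrors the snippet in the full card for this run, using the `harness_winogrande_5` configuration):

```python
from datasets import load_dataset

# Per-sample details for the Winogrande 5-shot task; the "train" split
# always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-1.4b-deduped",
	"harness_winogrande_5",
	split="train")
```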
## Latest results
These are the latest results from run 2023-10-16T20:03:21.000306 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/pythia-1.4b-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-1.4b-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T20:03:21.000306(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/pythia-1.4b-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-1.4b-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T20:03:21.000306(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/pythia-1.4b-deduped## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-1.4b-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T20:03:21.000306(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
0a4c415b3470827e5799868d0e22ed1071b97f1d
|
# Dataset Card for Evaluation run of EleutherAI/gpt-neo-125m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/gpt-neo-125m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/gpt-neo-125m](https://huggingface.co/EleutherAI/gpt-neo-125m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__gpt-neo-125m",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T09:42:25.890470](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-neo-125m/blob/main/results_2023-10-18T09-42-25.890470.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826801,
"f1": 0.03690436241610747,
"f1_stderr": 0.0011592977848577672,
"acc": 0.2603955425321017,
"acc_stderr": 0.007779096578699754
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826801,
"f1": 0.03690436241610747,
"f1_stderr": 0.0011592977848577672
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245494
},
"harness|winogrande|5": {
"acc": 0.5177584846093133,
"acc_stderr": 0.014043619596174959
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__gpt-neo-125m
|
[
"region:us"
] |
2023-08-17T22:46:13+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/gpt-neo-125m", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/gpt-neo-125m](https://huggingface.co/EleutherAI/gpt-neo-125m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__gpt-neo-125m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T09:42:25.890470](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-neo-125m/blob/main/results_2023-10-18T09-42-25.890470.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826801,\n \"f1\": 0.03690436241610747,\n \"f1_stderr\": 0.0011592977848577672,\n \"acc\": 0.2603955425321017,\n \"acc_stderr\": 0.007779096578699754\n },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826801,\n \"f1\": 0.03690436241610747,\n \"f1_stderr\": 0.0011592977848577672\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \"acc_stderr\": 0.0015145735612245494\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5177584846093133,\n \"acc_stderr\": 0.014043619596174959\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/gpt-neo-125m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T09_42_25.890470", "path": ["**/details_harness|drop|3_2023-10-18T09-42-25.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T09-42-25.890470.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T09_42_25.890470", "path": ["**/details_harness|gsm8k|5_2023-10-18T09-42-25.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T09-42-25.890470.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:58:00.274896.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:58:00.274896.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:58:00.274896.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:58:00.274896.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:58:00.274896.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T09_42_25.890470", "path": ["**/details_harness|winogrande|5_2023-10-18T09-42-25.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T09-42-25.890470.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_58_00.274896", "path": ["results_2023-07-19T13:58:00.274896.parquet"]}, {"split": "2023_10_18T09_42_25.890470", "path": ["results_2023-10-18T09-42-25.890470.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T09-42-25.890470.parquet"]}]}]}
|
2023-10-18T08:42:38+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/gpt-neo-125m
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/gpt-neo-125m on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
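```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__gpt-neo-125m",
	"harness_winogrande_5",
	split="train")
```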
## Latest results
These are the latest results from run 2023-10-18T09:42:25.890470 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/gpt-neo-125m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/gpt-neo-125m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T09:42:25.890470(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/gpt-neo-125m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/gpt-neo-125m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T09:42:25.890470(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/gpt-neo-125m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/gpt-neo-125m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T09:42:25.890470(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
74db4fc37ffb3147ff91faddcedc9bc65b5322b1
|
# Dataset Card for Evaluation run of EleutherAI/pythia-70m-deduped
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-70m-deduped
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-70m-deduped](https://huggingface.co/EleutherAI/pythia-70m-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-70m-deduped",
"harness_winogrande_5",
split="train")
```
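
To pull the aggregated metrics rather than per-task details, the "results" configuration listed for this repository can be loaded the same way — a minimal sketch, assuming the split layout described above (the "latest" split pointing to the most recent run):

```python
from datasets import load_dataset

# Aggregated results for every evaluation run; the "latest" split is the most recent one.
results = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-70m-deduped",
                       "results",
                       split="latest")
```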
## Latest results
These are the [latest results from run 2023-10-19T00:18:19.073831](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-70m-deduped/blob/main/results_2023-10-19T00-18-19.073831.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119184,
"f1": 0.023000209731543642,
"f1_stderr": 0.0009427318515971101,
"acc": 0.24822415153906865,
"acc_stderr": 0.007026065573457934
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119184,
"f1": 0.023000209731543642,
"f1_stderr": 0.0009427318515971101
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.4964483030781373,
"acc_stderr": 0.014052131146915867
}
}
```
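
The same numbers can also be read straight from the JSON file linked above — a minimal sketch, assuming the `huggingface_hub` client is installed (the filename comes from the link in this section):

```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file for the 2023-10-19T00:18:19.073831 run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_EleutherAI__pythia-70m-deduped",
    filename="results_2023-10-19T00-18-19.073831.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)
print(sorted(raw.keys()))  # inspect the top-level structure of the results file
```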
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__pythia-70m-deduped
|
[
"region:us"
] |
2023-08-17T22:46:22+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/pythia-70m-deduped", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/pythia-70m-deduped](https://huggingface.co/EleutherAI/pythia-70m-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-70m-deduped\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T00:18:19.073831](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-70m-deduped/blob/main/results_2023-10-19T00-18-19.073831.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119184,\n \"f1\": 0.023000209731543642,\n \"f1_stderr\": 0.0009427318515971101,\n \"acc\": 0.24822415153906865,\n \"acc_stderr\": 0.007026065573457934\n },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119184,\n \"f1\": 0.023000209731543642,\n \"f1_stderr\": 0.0009427318515971101\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4964483030781373,\n \"acc_stderr\": 0.014052131146915867\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/pythia-70m-deduped", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T00_18_19.073831", "path": ["**/details_harness|drop|3_2023-10-19T00-18-19.073831.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T00-18-19.073831.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T00_18_19.073831", "path": ["**/details_harness|gsm8k|5_2023-10-19T00-18-19.073831.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-19T00-18-19.073831.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:42:51.890470.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:42:51.890470.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:42:51.890470.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:42:51.890470.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:42:51.890470.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T00_18_19.073831", "path": ["**/details_harness|winogrande|5_2023-10-19T00-18-19.073831.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T00-18-19.073831.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_42_51.890470", "path": ["results_2023-07-19T13:42:51.890470.parquet"]}, {"split": "2023_10_19T00_18_19.073831", "path": ["results_2023-10-19T00-18-19.073831.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T00-18-19.073831.parquet"]}]}]}
|
2023-10-18T23:18:31+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/pythia-70m-deduped
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/pythia-70m-deduped on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
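A minimal sketch of such a call is given below; it assumes the details repository follows the leaderboard naming pattern (details_<org>__<model>) and uses the "harness_winogrande_5" configuration listed in this card's metadata:
```python
from datasets import load_dataset

# Sketch only: the repository id and configuration name follow the naming
# pattern used elsewhere in this card; adjust them if they differ.
data = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__pythia-70m-deduped",
    "harness_winogrande_5",
    split="train",
)
print(data)
```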
## Latest results
These are the latest results from run 2023-10-19T00:18:19.073831 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/pythia-70m-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-70m-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T00:18:19.073831(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/pythia-70m-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-70m-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T00:18:19.073831(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/pythia-70m-deduped## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-70m-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-19T00:18:19.073831(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
59be791c9d6a7d48ae862f20a057fee31990dbb3
|
# Dataset Card for Evaluation run of EleutherAI/pythia-12b-deduped
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-12b-deduped
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-12b-deduped](https://huggingface.co/EleutherAI/pythia-12b-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-12b-deduped",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-21T20:55:12.299775](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-12b-deduped/blob/main/results_2023-10-21T20-55-12.299775.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801232,
"f1": 0.04548238255033574,
"f1_stderr": 0.0011460514648967963,
"acc": 0.3394834047701824,
"acc_stderr": 0.008275815910994185
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801232,
"f1": 0.04548238255033574,
"f1_stderr": 0.0011460514648967963
},
"harness|gsm8k|5": {
"acc": 0.014404852160727824,
"acc_stderr": 0.003282055917136946
},
"harness|winogrande|5": {
"acc": 0.664561957379637,
"acc_stderr": 0.013269575904851425
}
}
```
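The aggregated numbers above can also be pulled programmatically through the "results" configuration declared for this dataset; the snippet below is a sketch, and the exact column layout of the results parquet may differ from the JSON excerpt:
```python
from datasets import load_dataset

# Sketch: the "results" configuration and its "latest" split are declared in
# this dataset's metadata; the row contents are assumed to mirror the
# aggregated JSON shown above.
results = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__pythia-12b-deduped",
    "results",
    split="latest",
)
print(results[0])
```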
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__pythia-12b-deduped
|
[
"region:us"
] |
2023-08-17T22:46:31+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/pythia-12b-deduped", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/pythia-12b-deduped](https://huggingface.co/EleutherAI/pythia-12b-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-12b-deduped\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-21T20:55:12.299775](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-12b-deduped/blob/main/results_2023-10-21T20-55-12.299775.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801232,\n \"f1\": 0.04548238255033574,\n \"f1_stderr\": 0.0011460514648967963,\n \"acc\": 0.3394834047701824,\n \"acc_stderr\": 0.008275815910994185\n },\n \"harness|drop|3\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801232,\n \"f1\": 0.04548238255033574,\n \"f1_stderr\": 0.0011460514648967963\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \"acc_stderr\": 0.003282055917136946\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.664561957379637,\n \"acc_stderr\": 0.013269575904851425\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/pythia-12b-deduped", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_21T20_55_12.299775", "path": ["**/details_harness|drop|3_2023-10-21T20-55-12.299775.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-21T20-55-12.299775.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_21T20_55_12.299775", "path": ["**/details_harness|gsm8k|5_2023-10-21T20-55-12.299775.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-21T20-55-12.299775.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:15:42.026882.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:15:42.026882.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:15:42.026882.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:15:42.026882.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:15:42.026882.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_21T20_55_12.299775", "path": ["**/details_harness|winogrande|5_2023-10-21T20-55-12.299775.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-21T20-55-12.299775.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_15_42.026882", "path": ["results_2023-07-19T18:15:42.026882.parquet"]}, {"split": "2023_10_21T20_55_12.299775", "path": ["results_2023-10-21T20-55-12.299775.parquet"]}, {"split": "latest", "path": ["results_2023-10-21T20-55-12.299775.parquet"]}]}]}
|
2023-10-21T19:55:24+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/pythia-12b-deduped
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/pythia-12b-deduped on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
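A minimal sketch of such a call is shown below; it assumes this entry's details repository follows the same `open-llm-leaderboard/details_<org>__<model>` naming pattern as the other entries in this document, and uses the `harness_winogrande_5` configuration named in its metadata.
```python
from datasets import load_dataset

# Sketch only: the repository id below is inferred from the naming pattern
# used by the other Pythia entries; the config name comes from this entry's metadata.
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-12b-deduped",
	"harness_winogrande_5",
	split="train")
```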
## Latest results
These are the latest results from run 2023-10-21T20:55:12.299775 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/pythia-12b-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-12b-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T20:55:12.299775(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/pythia-12b-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-12b-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T20:55:12.299775(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/pythia-12b-deduped## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-12b-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-21T20:55:12.299775(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
ba5fb098e9c994f2265a9fd65449ab3c5bf272f1
|
# Dataset Card for Evaluation run of EleutherAI/pythia-2.7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-2.7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-2.7b](https://huggingface.co/EleutherAI/pythia-2.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-2.7b",
"harness_winogrande_5",
split="train")
```
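The aggregated metrics can be loaded in the same way from the "results" configuration described above. This is a sketch assuming the standard `datasets` API, using the configuration and split names listed in this card's metadata:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics;
# the "latest" split always points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-2.7b",
	"results",
	split="latest")
```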
## Latest results
These are the [latest results from run 2023-10-13T09:56:45.840530](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-2.7b/blob/main/results_2023-10-13T09-56-45.840530.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219123,
"f1": 0.045142617449664524,
"f1_stderr": 0.0011715787583837264,
"acc": 0.315883214963382,
"acc_stderr": 0.008228218962784018
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219123,
"f1": 0.045142617449664524,
"f1_stderr": 0.0011715787583837264
},
"harness|gsm8k|5": {
"acc": 0.01061410159211524,
"acc_stderr": 0.002822713322387704
},
"harness|winogrande|5": {
"acc": 0.6211523283346487,
"acc_stderr": 0.013633724603180334
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__pythia-2.7b
|
[
"region:us"
] |
2023-08-17T22:46:39+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/pythia-2.7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/pythia-2.7b](https://huggingface.co/EleutherAI/pythia-2.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-2.7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T09:56:45.840530](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-2.7b/blob/main/results_2023-10-13T09-56-45.840530.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219123,\n \"f1\": 0.045142617449664524,\n \"f1_stderr\": 0.0011715787583837264,\n \"acc\": 0.315883214963382,\n \"acc_stderr\": 0.008228218962784018\n },\n \"harness|drop|3\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219123,\n \"f1\": 0.045142617449664524,\n \"f1_stderr\": 0.0011715787583837264\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01061410159211524,\n \"acc_stderr\": 0.002822713322387704\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6211523283346487,\n \"acc_stderr\": 0.013633724603180334\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/pythia-2.7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T09_56_45.840530", "path": ["**/details_harness|drop|3_2023-10-13T09-56-45.840530.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T09-56-45.840530.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T09_56_45.840530", "path": ["**/details_harness|gsm8k|5_2023-10-13T09-56-45.840530.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T09-56-45.840530.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:50:21.612353.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:50:21.612353.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:50:21.612353.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:50:21.612353.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:50:21.612353.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T09_56_45.840530", "path": ["**/details_harness|winogrande|5_2023-10-13T09-56-45.840530.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T09-56-45.840530.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_50_21.612353", "path": ["results_2023-07-19T16:50:21.612353.parquet"]}, {"split": "2023_10_13T09_56_45.840530", "path": ["results_2023-10-13T09-56-45.840530.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T09-56-45.840530.parquet"]}]}]}
|
2023-10-13T08:56:57+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/pythia-2.7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/pythia-2.7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
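A minimal example of such a call, assuming the standard `datasets` API and the repository id given for this model elsewhere in this entry:
```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task of this model.
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-2.7b",
	"harness_winogrande_5",
	split="train")
```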
## Latest results
These are the latest results from run 2023-10-13T09:56:45.840530 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/pythia-2.7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-2.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-13T09:56:45.840530(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/pythia-2.7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-2.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-13T09:56:45.840530(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/pythia-2.7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-2.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T09:56:45.840530(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
da3a2ffe5a763a95c286f24ea11850ce77a8e0fb
|
# Dataset Card for Evaluation run of EleutherAI/pythia-2.8b-deduped
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-2.8b-deduped
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-2.8b-deduped](https://huggingface.co/EleutherAI/pythia-2.8b-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-2.8b-deduped",
"harness_winogrande_5",
split="train")
```
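The aggregated metrics can be loaded in the same way from the "results" configuration. The snippet below is a minimal sketch that relies on the "latest" split exposed by this repository:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; the "latest" split always
# points to the newest results file of this repository.
results = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__pythia-2.8b-deduped",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated scores of the latest run
```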
## Latest results
These are the [latest results from run 2023-10-22T02:23:42.600907](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-2.8b-deduped/blob/main/results_2023-10-22T02-23-42.600907.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931190916,
"f1": 0.0446549916107384,
"f1_stderr": 0.0011620582208289672,
"acc": 0.30527479800116447,
"acc_stderr": 0.008130342870304771
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931190916,
"f1": 0.0446549916107384,
"f1_stderr": 0.0011620582208289672
},
"harness|gsm8k|5": {
"acc": 0.008339651250947688,
"acc_stderr": 0.002504942226860519
},
"harness|winogrande|5": {
"acc": 0.6022099447513812,
"acc_stderr": 0.013755743513749023
}
}
```
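Since the dataset exposes 64 configurations, they can also be enumerated programmatically. This is an illustrative sketch using the standard `datasets` helper:
```python
from datasets import get_dataset_config_names

# Enumerate every configuration exposed by this repository (per-task
# harness configs plus the aggregated "results" configuration).
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_EleutherAI__pythia-2.8b-deduped"
)
print(len(configs), "configurations")
print(configs[:5])
```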
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__pythia-2.8b-deduped
|
[
"region:us"
] |
2023-08-17T22:46:48+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/pythia-2.8b-deduped", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/pythia-2.8b-deduped](https://huggingface.co/EleutherAI/pythia-2.8b-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-2.8b-deduped\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T02:23:42.600907](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-2.8b-deduped/blob/main/results_2023-10-22T02-23-42.600907.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931190916,\n \"f1\": 0.0446549916107384,\n \"f1_stderr\": 0.0011620582208289672,\n \"acc\": 0.30527479800116447,\n \"acc_stderr\": 0.008130342870304771\n },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931190916,\n \"f1\": 0.0446549916107384,\n \"f1_stderr\": 0.0011620582208289672\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.008339651250947688,\n \"acc_stderr\": 0.002504942226860519\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6022099447513812,\n \"acc_stderr\": 0.013755743513749023\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/pythia-2.8b-deduped", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T02_23_42.600907", "path": ["**/details_harness|drop|3_2023-10-22T02-23-42.600907.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T02-23-42.600907.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T02_23_42.600907", "path": ["**/details_harness|gsm8k|5_2023-10-22T02-23-42.600907.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T02-23-42.600907.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:26:01.712520.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:26:01.712520.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:26:01.712520.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:26:01.712520.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:26:01.712520.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T02_23_42.600907", "path": ["**/details_harness|winogrande|5_2023-10-22T02-23-42.600907.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T02-23-42.600907.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T17_26_01.712520", "path": ["results_2023-07-19T17:26:01.712520.parquet"]}, {"split": "2023_10_22T02_23_42.600907", "path": ["results_2023-10-22T02-23-42.600907.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T02-23-42.600907.parquet"]}]}]}
|
2023-10-22T01:23:55+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/pythia-2.8b-deduped
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/pythia-2.8b-deduped on the Open LLM Leaderboard.
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-22T02:23:42.600907(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/pythia-2.8b-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-2.8b-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T02:23:42.600907(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/pythia-2.8b-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-2.8b-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T02:23:42.600907(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/pythia-2.8b-deduped## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-2.8b-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T02:23:42.600907(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
9da4eafb64a389c30fcc2caa18bed674bc3a0b08
|
# Dataset Card for Evaluation run of EleutherAI/pythia-6.7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-6.7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-6.7b](https://huggingface.co/EleutherAI/pythia-6.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-6.7b",
"harness_winogrande_5",
split="train")
```
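Older runs remain reachable through their timestamped splits. As a sketch (the split name below is taken from this repository's configuration metadata):
```python
from datasets import load_dataset

# Load the 2023-10-21 winogrande run by its timestamped split instead of
# the moving "latest" split.
data = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__pythia-6.7b",
    "harness_winogrande_5",
    split="2023_10_21T21_18_46.645949",
)
```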
## Latest results
These are the [latest results from run 2023-10-21T21:18:46.645949](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-6.7b/blob/main/results_2023-10-21T21-18-46.645949.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.00034761798968570957,
"f1": 0.04782403523489941,
"f1_stderr": 0.001192823686148428,
"acc": 0.3289061036768785,
"acc_stderr": 0.008126220712088333
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.00034761798968570957,
"f1": 0.04782403523489941,
"f1_stderr": 0.001192823686148428
},
"harness|gsm8k|5": {
"acc": 0.01061410159211524,
"acc_stderr": 0.0028227133223877035
},
"harness|winogrande|5": {
"acc": 0.6471981057616417,
"acc_stderr": 0.013429728101788961
}
}
```
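For quick inspection, the per-example details of a task can be converted to a pandas DataFrame. This is only a sketch and assumes pandas is installed alongside `datasets`:
```python
from datasets import load_dataset

# Per-example details of the latest GSM8K run, viewed as a pandas DataFrame.
details = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__pythia-6.7b",
    "harness_gsm8k_5",
    split="latest",
)
df = details.to_pandas()
print(df.head())
```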
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__pythia-6.7b
|
[
"region:us"
] |
2023-08-17T22:46:57+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/pythia-6.7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/pythia-6.7b](https://huggingface.co/EleutherAI/pythia-6.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-6.7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-21T21:18:46.645949](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-6.7b/blob/main/results_2023-10-21T21-18-46.645949.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.00034761798968570957,\n \"f1\": 0.04782403523489941,\n \"f1_stderr\": 0.001192823686148428,\n \"acc\": 0.3289061036768785,\n \"acc_stderr\": 0.008126220712088333\n },\n \"harness|drop|3\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.00034761798968570957,\n \"f1\": 0.04782403523489941,\n \"f1_stderr\": 0.001192823686148428\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01061410159211524,\n \"acc_stderr\": 0.0028227133223877035\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6471981057616417,\n \"acc_stderr\": 0.013429728101788961\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/pythia-6.7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_21T21_18_46.645949", "path": ["**/details_harness|drop|3_2023-10-21T21-18-46.645949.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-21T21-18-46.645949.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_21T21_18_46.645949", "path": ["**/details_harness|gsm8k|5_2023-10-21T21-18-46.645949.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-21T21-18-46.645949.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:34:10.394938.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:34:10.394938.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:34:10.394938.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:34:10.394938.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:34:10.394938.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_21T21_18_46.645949", "path": ["**/details_harness|winogrande|5_2023-10-21T21-18-46.645949.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-21T21-18-46.645949.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T17_34_10.394938", "path": ["results_2023-07-19T17:34:10.394938.parquet"]}, {"split": "2023_10_21T21_18_46.645949", "path": ["results_2023-10-21T21-18-46.645949.parquet"]}, {"split": "latest", "path": ["results_2023-10-21T21-18-46.645949.parquet"]}]}]}
|
2023-10-21T20:18:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/pythia-6.7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/pythia-6.7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
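The code block that normally accompanies this sentence is not reproduced in this rendering. Below is a minimal sketch of what such a call might look like; the repository id is an assumption based on the usual `open-llm-leaderboard/details_<org>__<model>` naming, and `harness_winogrande_5` / `latest` are taken from the configurations and splits listed in this card's metadata:

```python
from datasets import load_dataset

# Repo id is an assumption based on the standard naming scheme for leaderboard detail dumps.
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-6.7b",
	"harness_winogrande_5",  # any configuration listed in the metadata works here
	split="latest")          # "latest" points to the most recent run
```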
## Latest results
These are the latest results from run 2023-10-21T21:18:46.645949 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/pythia-6.7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-6.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T21:18:46.645949(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/pythia-6.7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-6.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T21:18:46.645949(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/pythia-6.7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-6.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-21T21:18:46.645949(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
30c73ec1ece66ffcb060174c85ed8b26d3458522
|
# Dataset Card for Evaluation run of EleutherAI/pythia-160m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-160m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-160m](https://huggingface.co/EleutherAI/pythia-160m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-160m",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-19T00:42:11.960734](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-160m/blob/main/results_2023-10-19T00-42-11.960734.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119131,
"f1": 0.03449874161073832,
"f1_stderr": 0.0010696643616809897,
"acc": 0.2588325685012862,
"acc_stderr": 0.007678992302325538
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119131,
"f1": 0.03449874161073832,
"f1_stderr": 0.0010696643616809897
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674164
},
"harness|winogrande|5": {
"acc": 0.5153906866614049,
"acc_stderr": 0.01404582678978366
}
}
```
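If you prefer to pull these aggregated numbers programmatically rather than copy them from the card, a small sketch is shown below, assuming the `results` configuration described above and its `latest` split:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; "latest" points to the newest one.
results = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-160m",
	"results",
	split="latest")
print(results[0])  # e.g. the "all" / "harness|winogrande|5" figures shown above
```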
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__pythia-160m
|
[
"region:us"
] |
2023-08-17T22:47:05+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/pythia-160m", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/pythia-160m](https://huggingface.co/EleutherAI/pythia-160m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-160m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T00:42:11.960734](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-160m/blob/main/results_2023-10-19T00-42-11.960734.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119131,\n \"f1\": 0.03449874161073832,\n \"f1_stderr\": 0.0010696643616809897,\n \"acc\": 0.2588325685012862,\n \"acc_stderr\": 0.007678992302325538\n },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119131,\n \"f1\": 0.03449874161073832,\n \"f1_stderr\": 0.0010696643616809897\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \"acc_stderr\": 0.0013121578148674164\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5153906866614049,\n \"acc_stderr\": 0.01404582678978366\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/pythia-160m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T00_42_11.960734", "path": ["**/details_harness|drop|3_2023-10-19T00-42-11.960734.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T00-42-11.960734.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T00_42_11.960734", "path": ["**/details_harness|gsm8k|5_2023-10-19T00-42-11.960734.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-19T00-42-11.960734.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:01:14.258064.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:01:14.258064.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:01:14.258064.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:01:14.258064.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:01:14.258064.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T00_42_11.960734", "path": ["**/details_harness|winogrande|5_2023-10-19T00-42-11.960734.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T00-42-11.960734.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_01_14.258064", "path": ["results_2023-07-19T14:01:14.258064.parquet"]}, {"split": "2023_10_19T00_42_11.960734", "path": ["results_2023-10-19T00-42-11.960734.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T00-42-11.960734.parquet"]}]}]}
|
2023-10-18T23:42:24+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/pythia-160m
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/pythia-160m on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-19T00:42:11.960734 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/pythia-160m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-160m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T00:42:11.960734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/pythia-160m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-160m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T00:42:11.960734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/pythia-160m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-160m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-19T00:42:11.960734(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
26fd880b3e9e16cd7464d3a41c0fca45b8461838
|
# Dataset Card for Evaluation run of EleutherAI/pythia-70m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-70m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-70m](https://huggingface.co/EleutherAI/pythia-70m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-70m",
"harness_winogrande_5",
split="train")
```
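Since each evaluated task lives in its own configuration, it can be convenient to enumerate the available configurations before picking one. The following is a minimal sketch using the standard `datasets` utility; the repository name is the one given above, and the printed names are only illustrative:

```python
from datasets import get_dataset_config_names

# Enumerate every per-task configuration exposed by this details repository.
configs = get_dataset_config_names("open-llm-leaderboard/details_EleutherAI__pythia-70m")

print(len(configs))   # expected to be on the order of the 64 configurations mentioned above
print(configs[:5])    # e.g. harness_arc_challenge_25, harness_drop_3, harness_gsm8k_5, ...
```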
## Latest results
These are the [latest results from run 2023-10-21T22:56:50.140170](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-70m/blob/main/results_2023-10-21T22-56-50.140170.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004404362416107382,
"em_stderr": 0.0006781451620479578,
"f1": 0.03329383389261746,
"f1_stderr": 0.001229822313283723,
"acc": 0.25881701056682943,
"acc_stderr": 0.0077805329722501985
},
"harness|drop|3": {
"em": 0.004404362416107382,
"em_stderr": 0.0006781451620479578,
"f1": 0.03329383389261746,
"f1_stderr": 0.001229822313283723
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245575
},
"harness|winogrande|5": {
"acc": 0.5146014206787688,
"acc_stderr": 0.014046492383275839
}
}
```
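The aggregated numbers shown above are also stored in the "results" configuration mentioned in the summary. As a small sketch (the configuration name "results" and the "latest" split are taken from this card's metadata; the exact column layout of the underlying parquet file is not documented here):

```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__pythia-70m",
    "results",
    split="latest",
)
print(results)
```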
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__pythia-70m
|
[
"region:us"
] |
2023-08-17T22:47:14+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/pythia-70m", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/pythia-70m](https://huggingface.co/EleutherAI/pythia-70m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-70m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-21T22:56:50.140170](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-70m/blob/main/results_2023-10-21T22-56-50.140170.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004404362416107382,\n \"em_stderr\": 0.0006781451620479578,\n \"f1\": 0.03329383389261746,\n \"f1_stderr\": 0.001229822313283723,\n \"acc\": 0.25881701056682943,\n \"acc_stderr\": 0.0077805329722501985\n },\n \"harness|drop|3\": {\n \"em\": 0.004404362416107382,\n \"em_stderr\": 0.0006781451620479578,\n \"f1\": 0.03329383389261746,\n \"f1_stderr\": 0.001229822313283723\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \"acc_stderr\": 0.0015145735612245575\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5146014206787688,\n \"acc_stderr\": 0.014046492383275839\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/pythia-70m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_21T22_56_50.140170", "path": ["**/details_harness|drop|3_2023-10-21T22-56-50.140170.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-21T22-56-50.140170.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_21T22_56_50.140170", "path": ["**/details_harness|gsm8k|5_2023-10-21T22-56-50.140170.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-21T22-56-50.140170.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:39:51.467973.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:39:51.467973.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:39:51.467973.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:39:51.467973.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:39:51.467973.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_21T22_56_50.140170", "path": ["**/details_harness|winogrande|5_2023-10-21T22-56-50.140170.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-21T22-56-50.140170.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_39_51.467973", "path": ["results_2023-07-19T13:39:51.467973.parquet"]}, {"split": "2023_10_21T22_56_50.140170", "path": ["results_2023-10-21T22-56-50.140170.parquet"]}, {"split": "latest", "path": ["results_2023-10-21T22-56-50.140170.parquet"]}]}]}
|
2023-10-21T21:57:02+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/pythia-70m
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/pythia-70m on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-21T22:56:50.140170 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/pythia-70m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-70m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T22:56:50.140170(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/pythia-70m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-70m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T22:56:50.140170(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
18,
31,
166,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/pythia-70m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-70m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-21T22:56:50.140170(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
6ee356d4954362631fdd03c2499b62d6a88248bd
|
# Dataset Card for Evaluation run of EleutherAI/polyglot-ko-12.8b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/polyglot-ko-12.8b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/polyglot-ko-12.8b](https://huggingface.co/EleutherAI/polyglot-ko-12.8b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__polyglot-ko-12.8b",
"harness_winogrande_5",
split="train")
```
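
If you are not sure which of the 64 configurations you need, a minimal sketch for listing them first (the names in the comment are only examples of the naming pattern, not an exhaustive list) is:

```python
from datasets import get_dataset_config_names

# List the per-task configurations of this details dataset
# (e.g. "harness_winogrande_5", "harness_gsm8k_5", "results", ...).
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_EleutherAI__polyglot-ko-12.8b"
)
print(len(configs))
print(configs[:5])
```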
## Latest results
These are the [latest results from run 2023-10-19T02:17:54.630291](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__polyglot-ko-12.8b/blob/main/results_2023-10-19T02-17-54.630291.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.04268036912751678,
"em_stderr": 0.0020700565850232436,
"f1": 0.09065960570469792,
"f1_stderr": 0.002370421899236817,
"acc": 0.2994953245415047,
"acc_stderr": 0.0074273230901261535
},
"harness|drop|3": {
"em": 0.04268036912751678,
"em_stderr": 0.0020700565850232436,
"f1": 0.09065960570469792,
"f1_stderr": 0.002370421899236817
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.0010717793485492619
},
"harness|winogrande|5": {
"acc": 0.5974743488555643,
"acc_stderr": 0.013782866831703044
}
}
```
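
The aggregated numbers above can also be read back programmatically; a minimal sketch, assuming the "results" configuration and its "latest" split load like any other configuration of this dataset:

```python
from datasets import load_dataset

# The "latest" split of the "results" configuration always points to the
# most recent aggregated run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__polyglot-ko-12.8b",
    "results",
    split="latest",
)

# Inspect the aggregated metrics (DROP em/f1, GSM8K acc, Winogrande acc, ...).
print(results[0])
```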
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__polyglot-ko-12.8b
|
[
"region:us"
] |
2023-08-17T22:47:23+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/polyglot-ko-12.8b", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/polyglot-ko-12.8b](https://huggingface.co/EleutherAI/polyglot-ko-12.8b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__polyglot-ko-12.8b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T02:17:54.630291](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__polyglot-ko-12.8b/blob/main/results_2023-10-19T02-17-54.630291.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.04268036912751678,\n \"em_stderr\": 0.0020700565850232436,\n \"f1\": 0.09065960570469792,\n \"f1_stderr\": 0.002370421899236817,\n \"acc\": 0.2994953245415047,\n \"acc_stderr\": 0.0074273230901261535\n },\n \"harness|drop|3\": {\n \"em\": 0.04268036912751678,\n \"em_stderr\": 0.0020700565850232436,\n \"f1\": 0.09065960570469792,\n \"f1_stderr\": 0.002370421899236817\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \"acc_stderr\": 0.0010717793485492619\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5974743488555643,\n \"acc_stderr\": 0.013782866831703044\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/polyglot-ko-12.8b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T02_17_54.630291", "path": ["**/details_harness|drop|3_2023-10-19T02-17-54.630291.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T02-17-54.630291.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T02_17_54.630291", "path": ["**/details_harness|gsm8k|5_2023-10-19T02-17-54.630291.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-19T02-17-54.630291.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:43:02.018732.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:43:02.018732.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:43:02.018732.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:43:02.018732.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:43:02.018732.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T02_17_54.630291", "path": ["**/details_harness|winogrande|5_2023-10-19T02-17-54.630291.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T02-17-54.630291.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_43_02.018732", "path": ["results_2023-07-19T18:43:02.018732.parquet"]}, {"split": "2023_10_19T02_17_54.630291", "path": ["results_2023-10-19T02-17-54.630291.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T02-17-54.630291.parquet"]}]}]}
|
2023-10-19T01:18:08+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/polyglot-ko-12.8b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/polyglot-ko-12.8b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-19T02:17:54.630291 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/polyglot-ko-12.8b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/polyglot-ko-12.8b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T02:17:54.630291(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/polyglot-ko-12.8b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/polyglot-ko-12.8b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T02:17:54.630291(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/polyglot-ko-12.8b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/polyglot-ko-12.8b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-19T02:17:54.630291(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
5c0474146c8f21c90d56f10e8bd45acda803611a
|
# Dataset Card for Evaluation run of EleutherAI/pythia-6.9b-deduped
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-6.9b-deduped
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-6.9b-deduped](https://huggingface.co/EleutherAI/pythia-6.9b-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-6.9b-deduped",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T01:47:10.144336](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-6.9b-deduped/blob/main/results_2023-10-22T01-47-10.144336.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335642,
"f1": 0.04495805369127533,
"f1_stderr": 0.0011424943224633687,
"acc": 0.32878164020122397,
"acc_stderr": 0.008505355545421337
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335642,
"f1": 0.04495805369127533,
"f1_stderr": 0.0011424943224633687
},
"harness|gsm8k|5": {
"acc": 0.016679302501895376,
"acc_stderr": 0.003527595888722438
},
"harness|winogrande|5": {
"acc": 0.6408839779005525,
"acc_stderr": 0.013483115202120236
}
}
```
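
The aggregated numbers above can also be read back programmatically; a minimal sketch, assuming the "results" configuration and its "latest" split load like any other configuration of this dataset:

```python
from datasets import load_dataset

# The "latest" split of the "results" configuration always points to the
# most recent aggregated run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__pythia-6.9b-deduped",
    "results",
    split="latest",
)

# Inspect the aggregated metrics (DROP em/f1, GSM8K acc, Winogrande acc, ...).
print(results[0])
```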
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__pythia-6.9b-deduped
|
[
"region:us"
] |
2023-08-17T22:47:32+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/pythia-6.9b-deduped", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/pythia-6.9b-deduped](https://huggingface.co/EleutherAI/pythia-6.9b-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-6.9b-deduped\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T01:47:10.144336](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-6.9b-deduped/blob/main/results_2023-10-22T01-47-10.144336.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335642,\n \"f1\": 0.04495805369127533,\n \"f1_stderr\": 0.0011424943224633687,\n \"acc\": 0.32878164020122397,\n \"acc_stderr\": 0.008505355545421337\n },\n \"harness|drop|3\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335642,\n \"f1\": 0.04495805369127533,\n \"f1_stderr\": 0.0011424943224633687\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.016679302501895376,\n \"acc_stderr\": 0.003527595888722438\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6408839779005525,\n \"acc_stderr\": 0.013483115202120236\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/pythia-6.9b-deduped", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T01_47_10.144336", "path": ["**/details_harness|drop|3_2023-10-22T01-47-10.144336.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T01-47-10.144336.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T01_47_10.144336", "path": ["**/details_harness|gsm8k|5_2023-10-22T01-47-10.144336.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T01-47-10.144336.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:40:55.095296.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:40:55.095296.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:40:55.095296.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:40:55.095296.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:40:55.095296.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T01_47_10.144336", "path": ["**/details_harness|winogrande|5_2023-10-22T01-47-10.144336.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T01-47-10.144336.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T17_40_55.095296", "path": ["results_2023-07-19T17:40:55.095296.parquet"]}, {"split": "2023_10_22T01_47_10.144336", "path": ["results_2023-10-22T01-47-10.144336.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T01-47-10.144336.parquet"]}]}]}
|
2023-10-22T00:47:22+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/pythia-6.9b-deduped
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/pythia-6.9b-deduped on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
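For example, using the `datasets` library with one of the task configurations of this dataset (this is the loading snippet given in the card metadata):

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task of this run;
# the "train" split always mirrors the latest results.
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-6.9b-deduped",
                    "harness_winogrande_5",
                    split="train")
```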
## Latest results
These are the latest results from run 2023-10-22T01:47:10.144336 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
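The aggregated metrics recorded for this run are:

```python
{
    "all": {
        "em": 0.0007340604026845638,
        "em_stderr": 0.0002773614457335642,
        "f1": 0.04495805369127533,
        "f1_stderr": 0.0011424943224633687,
        "acc": 0.32878164020122397,
        "acc_stderr": 0.008505355545421337
    },
    "harness|drop|3": {
        "em": 0.0007340604026845638,
        "em_stderr": 0.0002773614457335642,
        "f1": 0.04495805369127533,
        "f1_stderr": 0.0011424943224633687
    },
    "harness|gsm8k|5": {
        "acc": 0.016679302501895376,
        "acc_stderr": 0.003527595888722438
    },
    "harness|winogrande|5": {
        "acc": 0.6408839779005525,
        "acc_stderr": 0.013483115202120236
    }
}
```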
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/pythia-6.9b-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-6.9b-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T01:47:10.144336(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/pythia-6.9b-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-6.9b-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T01:47:10.144336(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/pythia-6.9b-deduped## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-6.9b-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T01:47:10.144336(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
009fb4a387b94a9a5bb13af3b4b8e19e1693b8a1
|
# Dataset Card for Evaluation run of EleutherAI/gpt-neox-20b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/gpt-neox-20b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__gpt-neox-20b",
"harness_gsm8k_5",
split="train")
```
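Each earlier run is also exposed as a timestamp-named split of its configuration, with the split names listed in this repository's configs; a minimal sketch loading the 2023-12-03 GSM8K run directly:

```python
from datasets import load_dataset

# "latest" always mirrors the newest run; a timestamped split pins a specific run
gsm8k_details = load_dataset("open-llm-leaderboard/details_EleutherAI__gpt-neox-20b",
                             "harness_gsm8k_5",
                             split="2023_12_03T17_14_42.607420")
```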
## Latest results
These are the [latest results from run 2023-12-03T17:14:42.607420](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-neox-20b/blob/main/results_2023-12-03T17-14-42.607420.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.05458680818802123,
"acc_stderr": 0.00625744403791253
},
"harness|gsm8k|5": {
"acc": 0.05458680818802123,
"acc_stderr": 0.00625744403791253
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__gpt-neox-20b
|
[
"region:us"
] |
2023-08-17T22:47:41+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/gpt-neox-20b", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__gpt-neox-20b\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-03T17:14:42.607420](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-neox-20b/blob/main/results_2023-12-03T17-14-42.607420.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.05458680818802123,\n \"acc_stderr\": 0.00625744403791253\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05458680818802123,\n \"acc_stderr\": 0.00625744403791253\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/gpt-neox-20b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|arc:challenge|25_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_08T18_29_20.429481", "path": ["**/details_harness|drop|3_2023-09-08T18-29-20.429481.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-08T18-29-20.429481.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_08T18_29_20.429481", "path": ["**/details_harness|gsm8k|5_2023-09-08T18-29-20.429481.parquet"]}, {"split": "2023_12_03T17_14_42.607420", "path": ["**/details_harness|gsm8k|5_2023-12-03T17-14-42.607420.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-03T17-14-42.607420.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hellaswag|10_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:44:54.391639.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:44:54.391639.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:44:54.391639.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-20T10:44:54.391639.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": 
["**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:44:54.391639.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": 
["**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-20T10:44:54.391639.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-20T10:44:54.391639.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_08T18_29_20.429481", "path": ["**/details_harness|winogrande|5_2023-09-08T18-29-20.429481.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-08T18-29-20.429481.parquet"]}]}, {"config_name": "original_mmlu_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:20:23.118147.parquet", 
"**/details_original|mmlu:formal_logic|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:20:23.118147.parquet", 
"**/details_original|mmlu:anatomy|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:20:23.118147.parquet", 
"**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:20:23.118147.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_abstract_algebra_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_anatomy_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_astronomy_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_business_ethics_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_clinical_knowledge_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_college_biology_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_college_chemistry_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_college_computer_science_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_college_mathematics_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_college_medicine_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_college_physics_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_computer_security_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_conceptual_physics_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_econometrics_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_electrical_engineering_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_elementary_mathematics_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_formal_logic_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_global_facts_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_high_school_biology_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_high_school_chemistry_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_high_school_computer_science_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_high_school_european_history_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_high_school_geography_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_high_school_mathematics_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_high_school_microeconomics_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_high_school_physics_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_high_school_psychology_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": 
"original_mmlu_high_school_statistics_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_high_school_us_history_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_high_school_world_history_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_human_aging_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_human_sexuality_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_international_law_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_jurisprudence_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_logical_fallacies_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_machine_learning_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_management_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_marketing_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_medical_genetics_5", "data_files": [{"split": 
"2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_miscellaneous_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_moral_disputes_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_moral_scenarios_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_nutrition_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_philosophy_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_prehistory_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_professional_accounting_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_professional_law_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_professional_medicine_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_professional_psychology_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_public_relations_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": 
["**/details_original|mmlu:public_relations|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_security_studies_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_sociology_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_us_foreign_policy_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_virology_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "original_mmlu_world_religions_5", "data_files": [{"split": "2023_08_28T20_20_23.118147", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:20:23.118147.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:20:23.118147.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_20T10_44_54.391639", "path": ["results_2023-07-20T10:44:54.391639.parquet"]}, {"split": "2023_08_28T20_20_23.118147", "path": ["results_2023-08-28T20:20:23.118147.parquet"]}, {"split": "2023_09_08T18_29_20.429481", "path": ["results_2023-09-08T18-29-20.429481.parquet"]}, {"split": "2023_12_03T17_14_42.607420", "path": ["results_2023-12-03T17-14-42.607420.parquet"]}, {"split": "latest", "path": ["results_2023-12-03T17-14-42.607420.parquet"]}]}]}
|
2023-12-03T17:14:50+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/gpt-neox-20b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/gpt-neox-20b on the Open LLM Leaderboard.
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
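A minimal sketch (the original snippet is not preserved in this copy; the repository and configuration names below are assumptions, following the leaderboard's usual `details_<org>__<model>` pattern and one of the configurations listed in the metadata above):

```python
from datasets import load_dataset

# Assumed repository and configuration names, mirroring the pattern used by
# other Open LLM Leaderboard detail datasets.
data = load_dataset("open-llm-leaderboard/details_EleutherAI__gpt-neox-20b",
                    "harness_winogrande_5",
                    split="train")
```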
## Latest results
These are the latest results from run 2023-12-03T17:14:42.607420 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/gpt-neox-20b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/gpt-neox-20b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-03T17:14:42.607420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/gpt-neox-20b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/gpt-neox-20b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-03T17:14:42.607420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
170,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/gpt-neox-20b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/gpt-neox-20b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-03T17:14:42.607420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
41d506d838a65b49ae4d75768d1206a9fe58d88b
|
# Dataset Card for Evaluation run of EleutherAI/pythia-12b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-12b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-12b](https://huggingface.co/EleutherAI/pythia-12b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-12b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-12T13:49:53.203420](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-12b/blob/main/results_2023-10-12T13-49-53.203420.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0006291946308724832,
"em_stderr": 0.0002568002749723885,
"f1": 0.04447986577181216,
"f1_stderr": 0.0010992181181045415,
"acc": 0.32955534824940325,
"acc_stderr": 0.008541034020282903
},
"harness|drop|3": {
"em": 0.0006291946308724832,
"em_stderr": 0.0002568002749723885,
"f1": 0.04447986577181216,
"f1_stderr": 0.0010992181181045415
},
"harness|gsm8k|5": {
"acc": 0.017437452615617893,
"acc_stderr": 0.003605486867998272
},
"harness|winogrande|5": {
"acc": 0.6416732438831886,
"acc_stderr": 0.013476581172567535
}
}
```
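The aggregated numbers above are also exposed through the "results" configuration; a minimal sketch, assuming the split names follow the same timestamp/"latest" convention as the other configurations in this dataset:

```python
from datasets import load_dataset

# "latest" points to the most recent evaluation run (an assumption based on
# the split naming convention used by the other configurations).
results = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-12b",
                       "results",
                       split="latest")
print(results[0])
```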
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__pythia-12b
|
[
"region:us"
] |
2023-08-17T22:47:50+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/pythia-12b", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/pythia-12b](https://huggingface.co/EleutherAI/pythia-12b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-12b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-12T13:49:53.203420](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-12b/blob/main/results_2023-10-12T13-49-53.203420.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0006291946308724832,\n \"em_stderr\": 0.0002568002749723885,\n \"f1\": 0.04447986577181216,\n \"f1_stderr\": 0.0010992181181045415,\n \"acc\": 0.32955534824940325,\n \"acc_stderr\": 0.008541034020282903\n },\n \"harness|drop|3\": {\n \"em\": 0.0006291946308724832,\n \"em_stderr\": 0.0002568002749723885,\n \"f1\": 0.04447986577181216,\n \"f1_stderr\": 0.0010992181181045415\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.017437452615617893,\n \"acc_stderr\": 0.003605486867998272\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6416732438831886,\n \"acc_stderr\": 0.013476581172567535\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/pythia-12b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|arc:challenge|25_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_12T13_49_53.203420", "path": ["**/details_harness|drop|3_2023-10-12T13-49-53.203420.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-12T13-49-53.203420.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_12T13_49_53.203420", "path": ["**/details_harness|gsm8k|5_2023-10-12T13-49-53.203420.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-12T13-49-53.203420.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": 
["**/details_harness|hellaswag|10_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:06:28.460226.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:06:28.460226.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T22:15:03.187761.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T22:15:03.187761.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T22:15:03.187761.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-23T22:15:03.187761.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": 
["**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": 
"2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": 
"2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": 
"2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-23T22:15:03.187761.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-23T22:15:03.187761.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_12T13_49_53.203420", "path": ["**/details_harness|winogrande|5_2023-10-12T13-49-53.203420.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-12T13-49-53.203420.parquet"]}]}, {"config_name": "original_mmlu_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:25:00.431107.parquet", 
"**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:25:00.431107.parquet", 
"**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:25:00.431107.parquet", 
"**/details_original|mmlu:prehistory|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:25:00.431107.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_abstract_algebra_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_anatomy_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_astronomy_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_business_ethics_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_clinical_knowledge_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_college_biology_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_college_chemistry_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_college_computer_science_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_college_mathematics_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": 
["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_college_medicine_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_college_physics_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_computer_security_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_conceptual_physics_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_econometrics_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_electrical_engineering_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_elementary_mathematics_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_formal_logic_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_global_facts_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_high_school_biology_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_high_school_chemistry_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": 
["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_high_school_computer_science_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_high_school_european_history_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_high_school_geography_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_high_school_mathematics_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_high_school_microeconomics_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_high_school_physics_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_high_school_psychology_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_high_school_statistics_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_high_school_us_history_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_high_school_world_history_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_human_aging_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_human_sexuality_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_international_law_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_jurisprudence_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_logical_fallacies_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_machine_learning_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_management_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_marketing_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_medical_genetics_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:25:00.431107.parquet"]}]}, 
{"config_name": "original_mmlu_miscellaneous_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_moral_disputes_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_moral_scenarios_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_nutrition_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_philosophy_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_prehistory_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_professional_accounting_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_professional_law_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_professional_medicine_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_professional_psychology_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_public_relations_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_security_studies_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", 
"path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_sociology_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_us_foreign_policy_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_virology_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "original_mmlu_world_religions_5", "data_files": [{"split": "2023_08_28T20_25_00.431107", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:25:00.431107.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:25:00.431107.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_06_28.460226", "path": ["results_2023-07-19T18:06:28.460226.parquet"]}, {"split": "2023_08_23T22_15_03.187761", "path": ["results_2023-08-23T22:15:03.187761.parquet"]}, {"split": "2023_08_28T20_25_00.431107", "path": ["results_2023-08-28T20:25:00.431107.parquet"]}, {"split": "2023_10_12T13_49_53.203420", "path": ["results_2023-10-12T13-49-53.203420.parquet"]}, {"split": "latest", "path": ["results_2023-10-12T13-49-53.203420.parquet"]}]}]}
|
2023-10-12T12:50:06+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/pythia-12b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/pythia-12b on the Open LLM Leaderboard.
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
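For example, a minimal sketch (the repository id below is an assumption following the leaderboard's usual `details_<org>__<model>` naming; the configuration and split names are taken from the file listing in the metadata above):

```python
from datasets import load_dataset

# Assumed repository id for this model's evaluation details.
data = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__pythia-12b",
    "original_mmlu_world_religions_5",  # one of the configurations listed above
    split="latest",  # or a timestamped split such as "2023_08_28T20_25_00.431107"
)
```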
## Latest results
These are the latest results from run 2023-10-12T13:49:53.203420 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
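A similar sketch for pulling the aggregated numbers from the "results" configuration (same assumed repository id; "latest" resolves to the most recent run):

```python
from datasets import load_dataset

# "results" is one of the configurations listed in the metadata above.
results = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__pythia-12b",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics of the latest evaluation run
```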
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/pythia-12b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-12b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-12T13:49:53.203420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/pythia-12b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-12b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-12T13:49:53.203420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
18,
31,
166,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/pythia-12b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-12b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-12T13:49:53.203420(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
401223c80b49c2c0fc6e2c286d852db95b25ef09
|
# Dataset Card for Evaluation run of EleutherAI/pythia-160m-deduped
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-160m-deduped
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-160m-deduped](https://huggingface.co/EleutherAI/pythia-160m-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-160m-deduped",
"harness_winogrande_5",
split="train")
```
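If you are unsure which configuration or timestamped split to request, the helper functions in `datasets` can enumerate them (a small sketch; the exact lists depend on the runs stored in the repository):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_EleutherAI__pythia-160m-deduped"
print(get_dataset_config_names(repo))                         # e.g. "harness_winogrande_5", "results", ...
print(get_dataset_split_names(repo, "harness_winogrande_5"))  # timestamped splits plus "latest"
```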
## Latest results
These are the [latest results from run 2023-10-18T14:10:15.721061](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-160m-deduped/blob/main/results_2023-10-18T14-10-15.721061.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436387,
"f1": 0.033831795302013495,
"f1_stderr": 0.0011064778180343976,
"acc": 0.2580433025186501,
"acc_stderr": 0.007679640365653923
},
"harness|drop|3": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436387,
"f1": 0.033831795302013495,
"f1_stderr": 0.0011064778180343976
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674233
},
"harness|winogrande|5": {
"acc": 0.5138121546961326,
"acc_stderr": 0.014047122916440422
}
}
```
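The same aggregated metrics can be fetched programmatically from the "results" configuration (a sketch; the exact field layout may vary between runs):

```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__pythia-160m-deduped",
    "results",
    split="latest",
)
print(results[0])  # should mirror the aggregated numbers shown above
```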
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__pythia-160m-deduped
|
[
"region:us"
] |
2023-08-17T22:47:58+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/pythia-160m-deduped", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/pythia-160m-deduped](https://huggingface.co/EleutherAI/pythia-160m-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-160m-deduped\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T14:10:15.721061](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-160m-deduped/blob/main/results_2023-10-18T14-10-15.721061.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003145973154362416,\n \"em_stderr\": 0.0005734993648436387,\n \"f1\": 0.033831795302013495,\n \"f1_stderr\": 0.0011064778180343976,\n \"acc\": 0.2580433025186501,\n \"acc_stderr\": 0.007679640365653923\n },\n \"harness|drop|3\": {\n \"em\": 0.003145973154362416,\n \"em_stderr\": 0.0005734993648436387,\n \"f1\": 0.033831795302013495,\n \"f1_stderr\": 0.0011064778180343976\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \"acc_stderr\": 0.0013121578148674233\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5138121546961326,\n \"acc_stderr\": 0.014047122916440422\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/pythia-160m-deduped", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T14_10_15.721061", "path": ["**/details_harness|drop|3_2023-10-18T14-10-15.721061.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T14-10-15.721061.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T14_10_15.721061", "path": ["**/details_harness|gsm8k|5_2023-10-18T14-10-15.721061.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T14-10-15.721061.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:01:37.454131.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:01:37.454131.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:01:37.454131.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:01:37.454131.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:01:37.454131.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T14_10_15.721061", "path": ["**/details_harness|winogrande|5_2023-10-18T14-10-15.721061.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T14-10-15.721061.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_01_37.454131", "path": ["results_2023-07-19T14:01:37.454131.parquet"]}, {"split": "2023_10_18T14_10_15.721061", "path": ["results_2023-10-18T14-10-15.721061.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T14-10-15.721061.parquet"]}]}]}
|
2023-10-18T13:10:27+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/pythia-160m-deduped
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/pythia-160m-deduped on the Open LLM Leaderboard.
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-18T14:10:15.721061(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/pythia-160m-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-160m-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T14:10:15.721061(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/pythia-160m-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-160m-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T14:10:15.721061(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/pythia-160m-deduped## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-160m-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T14:10:15.721061(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b4b2536b8e6bceb8b1783f7bcd6581e231143c69
|
# Dataset Card for Evaluation run of EleutherAI/pythia-1.3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-1.3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-1.3b](https://huggingface.co/EleutherAI/pythia-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-1.3b",
"harness_winogrande_5",
split="train")
```
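
The aggregated numbers mentioned above live in the "results" configuration; a minimal sketch for loading its latest split (same pattern as above — the config and split names are taken from this card's description, so adjust if your copy differs) would be:

```python
from datasets import load_dataset

# Aggregated results for this model; the "latest" split always points to
# the most recent evaluation run described on this card.
results = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__pythia-1.3b",
    "results",
    split="latest",
)

print(results.column_names)  # inspect which aggregated fields are available
print(results[0])            # the latest aggregated record
```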
## Latest results
These are the [latest results from run 2023-10-21T20:31:22.068379](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-1.3b/blob/main/results_2023-10-21T20-31-22.068379.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one under the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219287,
"f1": 0.040563129194630954,
"f1_stderr": 0.0011177096979539825,
"acc": 0.29182616042743625,
"acc_stderr": 0.008309831271227
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219287,
"f1": 0.040563129194630954,
"f1_stderr": 0.0011177096979539825
},
"harness|gsm8k|5": {
"acc": 0.009855951478392721,
"acc_stderr": 0.00272107657704166
},
"harness|winogrande|5": {
"acc": 0.5737963693764798,
"acc_stderr": 0.013898585965412338
}
}
```
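
If only these headline metrics are needed, the dictionary above can be flattened into a small table; a minimal sketch (the numbers are copied from the JSON block above, with the stderr fields omitted for brevity) might look like:

```python
import pandas as pd

# Headline metrics copied from the latest-results JSON above (stderr omitted).
latest = {
    "all": {"em": 0.0010486577181208054, "f1": 0.040563129194630954, "acc": 0.29182616042743625},
    "harness|drop|3": {"em": 0.0010486577181208054, "f1": 0.040563129194630954},
    "harness|gsm8k|5": {"acc": 0.009855951478392721},
    "harness|winogrande|5": {"acc": 0.5737963693764798},
}

# One row per task, one column per metric; missing metrics become NaN.
df = pd.DataFrame.from_dict(latest, orient="index")
print(df)
```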
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__pythia-1.3b
|
[
"region:us"
] |
2023-08-17T22:48:07+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/pythia-1.3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/pythia-1.3b](https://huggingface.co/EleutherAI/pythia-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-1.3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-21T20:31:22.068379](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-1.3b/blob/main/results_2023-10-21T20-31-22.068379.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219287,\n \"f1\": 0.040563129194630954,\n \"f1_stderr\": 0.0011177096979539825,\n \"acc\": 0.29182616042743625,\n \"acc_stderr\": 0.008309831271227\n },\n \"harness|drop|3\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219287,\n \"f1\": 0.040563129194630954,\n \"f1_stderr\": 0.0011177096979539825\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009855951478392721,\n \"acc_stderr\": 0.00272107657704166\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5737963693764798,\n \"acc_stderr\": 0.013898585965412338\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/pythia-1.3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_21T20_31_22.068379", "path": ["**/details_harness|drop|3_2023-10-21T20-31-22.068379.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-21T20-31-22.068379.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_21T20_31_22.068379", "path": ["**/details_harness|gsm8k|5_2023-10-21T20-31-22.068379.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-21T20-31-22.068379.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:01:09.572948.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:01:09.572948.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:01:09.572948.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:01:09.572948.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:01:09.572948.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_21T20_31_22.068379", "path": ["**/details_harness|winogrande|5_2023-10-21T20-31-22.068379.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-21T20-31-22.068379.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_01_09.572948", "path": ["results_2023-07-19T15:01:09.572948.parquet"]}, {"split": "2023_10_21T20_31_22.068379", "path": ["results_2023-10-21T20-31-22.068379.parquet"]}, {"split": "latest", "path": ["results_2023-10-21T20-31-22.068379.parquet"]}]}]}
|
2023-10-21T19:31:34+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/pythia-1.3b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/pythia-1.3b on the Open LLM Leaderboard.
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-21T20:31:22.068379(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/pythia-1.3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T20:31:22.068379(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/pythia-1.3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T20:31:22.068379(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/pythia-1.3b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-21T20:31:22.068379(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
d7a94c6a73652919b68f419984d49681aec3ce82
|
# Dataset Card for Evaluation run of EleutherAI/gpt-j-6b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/gpt-j-6b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/gpt-j-6b](https://huggingface.co/EleutherAI/gpt-j-6b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 8 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__gpt-j-6b",
"harness_gsm8k_5",
split="train")
```
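
Since this dataset exposes 122 configurations, it can be convenient to enumerate them programmatically before picking one; a minimal sketch using `get_dataset_config_names` from the `datasets` library (the repository name is the same one used above):

```python
from datasets import get_dataset_config_names

# List every per-task configuration plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_EleutherAI__gpt-j-6b")
print(len(configs))   # expected to match the 122 configurations described above
print(configs[:5])    # a peek at the first few config names
```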
## Latest results
These are the [latest results from run 2023-12-03T16:52:26.173919](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-j-6b/blob/main/results_2023-12-03T16-52-26.173919.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one under the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.029567854435178165,
"acc_stderr": 0.004665893134220772
},
"harness|gsm8k|5": {
"acc": 0.029567854435178165,
"acc_stderr": 0.004665893134220772
}
}
```
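
As a quick sanity check on the numbers above, the reported standard error can be turned into an approximate confidence interval; a small sketch using a normal approximation (values copied from the JSON block above):

```python
# gsm8k 5-shot accuracy and its standard error, copied from the results above.
acc = 0.029567854435178165
acc_stderr = 0.004665893134220772

# Rough 95% interval under a normal approximation.
low, high = acc - 1.96 * acc_stderr, acc + 1.96 * acc_stderr
print(f"gsm8k 5-shot acc: {acc:.4f} (95% CI ~ [{low:.4f}, {high:.4f}])")
```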
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__gpt-j-6b
|
[
"region:us"
] |
2023-08-17T22:48:16+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/gpt-j-6b", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/gpt-j-6b](https://huggingface.co/EleutherAI/gpt-j-6b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 8 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__gpt-j-6b\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-03T16:52:26.173919](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-j-6b/blob/main/results_2023-12-03T16-52-26.173919.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.029567854435178165,\n \"acc_stderr\": 0.004665893134220772\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.029567854435178165,\n \"acc_stderr\": 0.004665893134220772\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/gpt-j-6b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|arc:challenge|25_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|arc:challenge|25_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_08T17_46_12.907701", "path": ["**/details_harness|drop|3_2023-09-08T17-46-12.907701.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-08T17-46-12.907701.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_08T17_46_12.907701", "path": ["**/details_harness|gsm8k|5_2023-09-08T17-46-12.907701.parquet"]}, {"split": "2023_12_03T16_51_11.923338", "path": ["**/details_harness|gsm8k|5_2023-12-03T16-51-11.923338.parquet"]}, {"split": "2023_12_03T16_52_26.173919", "path": ["**/details_harness|gsm8k|5_2023-12-03T16-52-26.173919.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-03T16-52-26.173919.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:28:34.858547.parquet"]}, {"split": 
"2023_07_19T10_47_17.854530", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hellaswag|10_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hellaswag|10_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:28:34.858547.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:28:34.858547.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:47:17.854530.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:47:17.854530.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:47:17.854530.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:41:28.653242.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-29T19:41:28.653242.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-10-49.133869.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-10-49.133869.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-10-49.133869.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-21T23-10-49.133869.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-anatomy|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": 
["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": 
"2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", 
"path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": 
"2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": 
["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": 
["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": 
"2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": 
["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T23-10-49.133869.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-21T23-10-49.133869.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_08T17_46_12.907701", "path": ["**/details_harness|winogrande|5_2023-09-08T17-46-12.907701.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-08T17-46-12.907701.parquet"]}]}, {"config_name": "original_mmlu_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:18:18.137533.parquet", 
"**/details_original|mmlu:high_school_physics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:18:18.137533.parquet", 
"**/details_original|mmlu:econometrics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:18:18.137533.parquet", 
"**/details_original|mmlu:virology|5_2023-08-28T20:18:18.137533.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_abstract_algebra_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_anatomy_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_astronomy_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_business_ethics_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_clinical_knowledge_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_college_biology_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_college_chemistry_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_college_computer_science_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_college_mathematics_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_college_medicine_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_college_physics_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:college_physics|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_computer_security_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_conceptual_physics_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_econometrics_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_electrical_engineering_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_elementary_mathematics_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_formal_logic_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_global_facts_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_high_school_biology_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_high_school_chemistry_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_high_school_computer_science_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_high_school_european_history_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", 
"path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_high_school_geography_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_high_school_mathematics_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_high_school_microeconomics_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_high_school_physics_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_high_school_psychology_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_high_school_statistics_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_high_school_us_history_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_high_school_world_history_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_human_aging_5", 
"data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_human_sexuality_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_international_law_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_jurisprudence_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_logical_fallacies_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_machine_learning_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_management_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_marketing_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_medical_genetics_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_miscellaneous_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_moral_disputes_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_moral_scenarios_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": 
"latest", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_nutrition_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_philosophy_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_prehistory_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_professional_accounting_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_professional_law_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_professional_medicine_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_professional_psychology_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_public_relations_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_security_studies_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_sociology_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_us_foreign_policy_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:18:18.137533.parquet"]}]}, 
{"config_name": "original_mmlu_virology_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "original_mmlu_world_religions_5", "data_files": [{"split": "2023_08_28T20_18_18.137533", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:18:18.137533.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:18:18.137533.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T11_28_34.858547", "path": ["results_2023-07-18T11:28:34.858547.parquet"]}, {"split": "2023_07_19T10_47_17.854530", "path": ["results_2023-07-19T10:47:17.854530.parquet"]}, {"split": "2023_08_28T20_18_18.137533", "path": ["results_2023-08-28T20:18:18.137533.parquet"]}, {"split": "2023_08_29T19_41_28.653242", "path": ["results_2023-08-29T19:41:28.653242.parquet"]}, {"split": "2023_09_08T17_46_12.907701", "path": ["results_2023-09-08T17-46-12.907701.parquet"]}, {"split": "2023_09_21T23_10_49.133869", "path": ["results_2023-09-21T23-10-49.133869.parquet"]}, {"split": "2023_12_03T16_51_11.923338", "path": ["results_2023-12-03T16-51-11.923338.parquet"]}, {"split": "2023_12_03T16_52_26.173919", "path": ["results_2023-12-03T16-52-26.173919.parquet"]}, {"split": "latest", "path": ["results_2023-12-03T16-52-26.173919.parquet"]}]}]}
|
2023-12-03T16:52:37+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/gpt-j-6b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/gpt-j-6b on the Open LLM Leaderboard.
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 8 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
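A minimal sketch of that call, assuming this card follows the same naming pattern as the other evaluation-run repositories (the repository id `open-llm-leaderboard/details_EleutherAI__gpt-j-6b` is inferred from that pattern; `harness_winogrande_5` is one of the configurations listed in this card's metadata):

```python
from datasets import load_dataset

# Load the per-sample details of one evaluated task; per the card, the "train"
# split always points at the latest results, while timestamped splits hold
# earlier runs.
data = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__gpt-j-6b",  # assumed repo id (details_* naming pattern)
    "harness_winogrande_5",                                # one evaluated-task configuration
    split="train",
)
```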
## Latest results
These are the latest results from run 2023-12-03T16:52:26.173919 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/gpt-j-6b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/gpt-j-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 8 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-03T16:52:26.173919(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/gpt-j-6b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/gpt-j-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 8 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-03T16:52:26.173919(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/gpt-j-6b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/gpt-j-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 8 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-03T16:52:26.173919(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
17fc8e5865cba92a8b792cbd7c6d9ee817711ce9
|
# Dataset Card for Evaluation run of EleutherAI/gpt-neo-2.7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/gpt-neo-2.7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/gpt-neo-2.7B](https://huggingface.co/EleutherAI/gpt-neo-2.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__gpt-neo-2.7B",
"harness_winogrande_5",
split="train")
```
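For the aggregated metrics mentioned above, a similar call can target the "results" configuration described in this card; a sketch, assuming the split names listed in the metadata ("latest" plus one split per timestamped run):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__gpt-neo-2.7B",
    "results",
    split="latest",
)
```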
## Latest results
These are the [latest results from run 2023-09-16T18:17:27.118418](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-neo-2.7B/blob/main/results_2023-09-16T18-17-27.118418.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460643,
"f1": 0.04774853187919481,
"f1_stderr": 0.0012502430800989544,
"acc": 0.3067599823596958,
"acc_stderr": 0.008435917406608623
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460643,
"f1": 0.04774853187919481,
"f1_stderr": 0.0012502430800989544
},
"harness|gsm8k|5": {
"acc": 0.01288855193328279,
"acc_stderr": 0.003106901266499639
},
"harness|winogrande|5": {
"acc": 0.6006314127861089,
"acc_stderr": 0.013764933546717609
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__gpt-neo-2.7B
|
[
"region:us"
] |
2023-08-17T22:48:27+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/gpt-neo-2.7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/gpt-neo-2.7B](https://huggingface.co/EleutherAI/gpt-neo-2.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__gpt-neo-2.7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-16T18:17:27.118418](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__gpt-neo-2.7B/blob/main/results_2023-09-16T18-17-27.118418.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196460643,\n \"f1\": 0.04774853187919481,\n \"f1_stderr\": 0.0012502430800989544,\n \"acc\": 0.3067599823596958,\n \"acc_stderr\": 0.008435917406608623\n },\n \"harness|drop|3\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196460643,\n \"f1\": 0.04774853187919481,\n \"f1_stderr\": 0.0012502430800989544\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01288855193328279,\n \"acc_stderr\": 0.003106901266499639\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6006314127861089,\n \"acc_stderr\": 0.013764933546717609\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/gpt-neo-2.7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T18_17_27.118418", "path": ["**/details_harness|drop|3_2023-09-16T18-17-27.118418.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-16T18-17-27.118418.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T18_17_27.118418", "path": ["**/details_harness|gsm8k|5_2023-09-16T18-17-27.118418.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-16T18-17-27.118418.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:18:37.000373.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:18:37.000373.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:18:37.000373.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:18:37.000373.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:18:37.000373.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T18_17_27.118418", "path": ["**/details_harness|winogrande|5_2023-09-16T18-17-27.118418.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-16T18-17-27.118418.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T17_18_37.000373", "path": ["results_2023-07-19T17:18:37.000373.parquet"]}, {"split": "2023_09_16T18_17_27.118418", "path": ["results_2023-09-16T18-17-27.118418.parquet"]}, {"split": "latest", "path": ["results_2023-09-16T18-17-27.118418.parquet"]}]}]}
|
2023-09-16T17:17:38+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/gpt-neo-2.7B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/gpt-neo-2.7B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
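For instance, the snippet below mirrors the loading pattern recorded in this card's metadata (the `harness_winogrande_5` configuration is one of the configurations listed for this repository):

```python
from datasets import load_dataset

# Load the winogrande details for EleutherAI/gpt-neo-2.7B;
# "train" always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_EleutherAI__gpt-neo-2.7B",
	"harness_winogrande_5",
	split="train")
```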
## Latest results
These are the latest results from run 2023-09-16T18:17:27.118418 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
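The aggregated figures for that run, reproduced from the card's metadata, are:

```python
{
    "all": {
        "em": 0.001363255033557047,
        "em_stderr": 0.0003778609196460643,
        "f1": 0.04774853187919481,
        "f1_stderr": 0.0012502430800989544,
        "acc": 0.3067599823596958,
        "acc_stderr": 0.008435917406608623
    },
    "harness|drop|3": {
        "em": 0.001363255033557047,
        "em_stderr": 0.0003778609196460643,
        "f1": 0.04774853187919481,
        "f1_stderr": 0.0012502430800989544
    },
    "harness|gsm8k|5": {
        "acc": 0.01288855193328279,
        "acc_stderr": 0.003106901266499639
    },
    "harness|winogrande|5": {
        "acc": 0.6006314127861089,
        "acc_stderr": 0.013764933546717609
    }
}
```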
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/gpt-neo-2.7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/gpt-neo-2.7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-16T18:17:27.118418(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/gpt-neo-2.7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/gpt-neo-2.7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-16T18:17:27.118418(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/gpt-neo-2.7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/gpt-neo-2.7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-16T18:17:27.118418(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
f7c7b91f1e8d03427a5f6f730811e8295ea3aba3
|
# Dataset Card for Evaluation run of EleutherAI/pythia-410m-deduped
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/EleutherAI/pythia-410m-deduped
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [EleutherAI/pythia-410m-deduped](https://huggingface.co/EleutherAI/pythia-410m-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EleutherAI__pythia-410m-deduped",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T09:27:36.064128](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-410m-deduped/blob/main/results_2023-10-18T09-27-36.064128.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298293,
"f1": 0.042572357382550524,
"f1_stderr": 0.0011637772390608397,
"acc": 0.27341843124559817,
"acc_stderr": 0.007756513586074438
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298293,
"f1": 0.042572357382550524,
"f1_stderr": 0.0011637772390608397
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245436
},
"harness|winogrande|5": {
"acc": 0.5438042620363063,
"acc_stderr": 0.013998453610924331
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_EleutherAI__pythia-410m-deduped
|
[
"region:us"
] |
2023-08-17T22:48:36+00:00
|
{"pretty_name": "Evaluation run of EleutherAI/pythia-410m-deduped", "dataset_summary": "Dataset automatically created during the evaluation run of model [EleutherAI/pythia-410m-deduped](https://huggingface.co/EleutherAI/pythia-410m-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EleutherAI__pythia-410m-deduped\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T09:27:36.064128](https://huggingface.co/datasets/open-llm-leaderboard/details_EleutherAI__pythia-410m-deduped/blob/main/results_2023-10-18T09-27-36.064128.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298293,\n \"f1\": 0.042572357382550524,\n \"f1_stderr\": 0.0011637772390608397,\n \"acc\": 0.27341843124559817,\n \"acc_stderr\": 0.007756513586074438\n },\n \"harness|drop|3\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298293,\n \"f1\": 0.042572357382550524,\n \"f1_stderr\": 0.0011637772390608397\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \"acc_stderr\": 0.0015145735612245436\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5438042620363063,\n \"acc_stderr\": 0.013998453610924331\n }\n}\n```", "repo_url": "https://huggingface.co/EleutherAI/pythia-410m-deduped", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T09_27_36.064128", "path": ["**/details_harness|drop|3_2023-10-18T09-27-36.064128.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T09-27-36.064128.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T09_27_36.064128", "path": ["**/details_harness|gsm8k|5_2023-10-18T09-27-36.064128.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T09-27-36.064128.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:23:02.980263.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:23:02.980263.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:23:02.980263.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:23:02.980263.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:23:02.980263.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T09_27_36.064128", "path": ["**/details_harness|winogrande|5_2023-10-18T09-27-36.064128.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T09-27-36.064128.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_23_02.980263", "path": ["results_2023-07-19T14:23:02.980263.parquet"]}, {"split": "2023_10_18T09_27_36.064128", "path": ["results_2023-10-18T09-27-36.064128.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T09-27-36.064128.parquet"]}]}]}
|
2023-10-18T08:27:47+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of EleutherAI/pythia-410m-deduped
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model EleutherAI/pythia-410m-deduped on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
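A minimal sketch, assuming the details repository follows the leaderboard's usual naming pattern (`open-llm-leaderboard/details_EleutherAI__pythia-410m-deduped` is inferred, not stated in this section) and using the `harness_winogrande_5` configuration and "latest" split listed in this card's metadata:

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the leaderboard naming convention;
# the configuration name and the "latest" split come from this card's config list.
data = load_dataset(
    "open-llm-leaderboard/details_EleutherAI__pythia-410m-deduped",
    "harness_winogrande_5",
    split="latest",  # "latest" always points at the newest run
)
print(data[0])
```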
## Latest results
These are the latest results from run 2023-10-18T09:27:36.064128 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of EleutherAI/pythia-410m-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-410m-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T09:27:36.064128(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EleutherAI/pythia-410m-deduped",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-410m-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T09:27:36.064128(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EleutherAI/pythia-410m-deduped## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model EleutherAI/pythia-410m-deduped on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T09:27:36.064128(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
55e9400f6c11bc2357301ecb6167ae9df4bfbbf5
|
# Dataset Card for Evaluation run of shibing624/chinese-alpaca-plus-13b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/shibing624/chinese-alpaca-plus-13b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [shibing624/chinese-alpaca-plus-13b-hf](https://huggingface.co/shibing624/chinese-alpaca-plus-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shibing624__chinese-alpaca-plus-13b-hf",
"harness_winogrande_5",
split="train")
```
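The aggregated scores are also exposed through the "results" configuration; a minimal sketch for pulling the most recent aggregated run (the configuration name and its "latest" split are listed in this card's metadata):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points at the newest results file.
results = load_dataset(
    "open-llm-leaderboard/details_shibing624__chinese-alpaca-plus-13b-hf",
    "results",
    split="latest",
)
print(results[0])
```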
## Latest results
These are the [latest results from run 2023-09-22T22:17:42.526571](https://huggingface.co/datasets/open-llm-leaderboard/details_shibing624__chinese-alpaca-plus-13b-hf/blob/main/results_2023-09-22T22-17-42.526571.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.21413590604026847,
"em_stderr": 0.004201052628405755,
"f1": 0.28453125000000146,
"f1_stderr": 0.004252615634369811,
"acc": 0.3859100763356038,
"acc_stderr": 0.00806531916338966
},
"harness|drop|3": {
"em": 0.21413590604026847,
"em_stderr": 0.004201052628405755,
"f1": 0.28453125000000146,
"f1_stderr": 0.004252615634369811
},
"harness|gsm8k|5": {
"acc": 0.02122820318423048,
"acc_stderr": 0.003970449129848635
},
"harness|winogrande|5": {
"acc": 0.7505919494869772,
"acc_stderr": 0.012160189196930687
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_shibing624__chinese-alpaca-plus-13b-hf
|
[
"region:us"
] |
2023-08-17T22:48:44+00:00
|
{"pretty_name": "Evaluation run of shibing624/chinese-alpaca-plus-13b-hf", "dataset_summary": "Dataset automatically created during the evaluation run of model [shibing624/chinese-alpaca-plus-13b-hf](https://huggingface.co/shibing624/chinese-alpaca-plus-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shibing624__chinese-alpaca-plus-13b-hf\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T22:17:42.526571](https://huggingface.co/datasets/open-llm-leaderboard/details_shibing624__chinese-alpaca-plus-13b-hf/blob/main/results_2023-09-22T22-17-42.526571.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.21413590604026847,\n \"em_stderr\": 0.004201052628405755,\n \"f1\": 0.28453125000000146,\n \"f1_stderr\": 0.004252615634369811,\n \"acc\": 0.3859100763356038,\n \"acc_stderr\": 0.00806531916338966\n },\n \"harness|drop|3\": {\n \"em\": 0.21413590604026847,\n \"em_stderr\": 0.004201052628405755,\n \"f1\": 0.28453125000000146,\n \"f1_stderr\": 0.004252615634369811\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02122820318423048,\n \"acc_stderr\": 0.003970449129848635\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7505919494869772,\n \"acc_stderr\": 0.012160189196930687\n }\n}\n```", "repo_url": "https://huggingface.co/shibing624/chinese-alpaca-plus-13b-hf", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T22_17_42.526571", "path": ["**/details_harness|drop|3_2023-09-22T22-17-42.526571.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T22-17-42.526571.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T22_17_42.526571", "path": ["**/details_harness|gsm8k|5_2023-09-22T22-17-42.526571.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T22-17-42.526571.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:20:40.370845.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:20:40.370845.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:20:40.370845.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:20:40.370845.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:20:40.370845.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:20:40.370845.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T22_17_42.526571", "path": ["**/details_harness|winogrande|5_2023-09-22T22-17-42.526571.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T22-17-42.526571.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_20_40.370845", "path": ["results_2023-07-19T19:20:40.370845.parquet"]}, {"split": "2023_09_22T22_17_42.526571", "path": ["results_2023-09-22T22-17-42.526571.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T22-17-42.526571.parquet"]}]}]}
|
2023-09-22T21:17:54+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of shibing624/chinese-alpaca-plus-13b-hf
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model shibing624/chinese-alpaca-plus-13b-hf on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
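The code block is stripped in this rendering; a minimal sketch is given below. The repository id is not spelled out here and is assumed to follow the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming pattern.

```python
from datasets import load_dataset

# Assumed repo id, following the standard leaderboard naming scheme.
data = load_dataset(
    "open-llm-leaderboard/details_shibing624__chinese-alpaca-plus-13b-hf",
    "harness_winogrande_5",
    split="train",
)
```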
## Latest results
These are the latest results from run 2023-09-22T22:17:42.526571 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of shibing624/chinese-alpaca-plus-13b-hf",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model shibing624/chinese-alpaca-plus-13b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T22:17:42.526571(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of shibing624/chinese-alpaca-plus-13b-hf",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model shibing624/chinese-alpaca-plus-13b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T22:17:42.526571(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of shibing624/chinese-alpaca-plus-13b-hf## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model shibing624/chinese-alpaca-plus-13b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T22:17:42.526571(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
c2f04d2f2342b04edbf53f175aeb647de542f5f3
|
# Dataset Card for Evaluation run of shibing624/chinese-alpaca-plus-7b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/shibing624/chinese-alpaca-plus-7b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [shibing624/chinese-alpaca-plus-7b-hf](https://huggingface.co/shibing624/chinese-alpaca-plus-7b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shibing624__chinese-alpaca-plus-7b-hf",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T01:27:29.107339](https://huggingface.co/datasets/open-llm-leaderboard/details_shibing624__chinese-alpaca-plus-7b-hf/blob/main/results_2023-10-18T01-27-29.107339.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.231753355704698,
"em_stderr": 0.004321186423348418,
"f1": 0.28607277684563825,
"f1_stderr": 0.004345067745668727,
"acc": 0.35384577180220117,
"acc_stderr": 0.007568088084173026
},
"harness|drop|3": {
"em": 0.231753355704698,
"em_stderr": 0.004321186423348418,
"f1": 0.28607277684563825,
"f1_stderr": 0.004345067745668727
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.002267537102254516
},
"harness|winogrande|5": {
"acc": 0.7008681925808997,
"acc_stderr": 0.012868639066091536
}
}
```
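If you want to pull these aggregated numbers programmatically instead of copying them from this card, a minimal sketch is shown below. It assumes the "results" configuration and "latest" split declared in this repository's configs; only standard `datasets` calls are used.

```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent evaluation run.
# "results" is the aggregated configuration; the "latest" split always
# points to the newest run's parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_shibing624__chinese-alpaca-plus-7b-hf",
    "results",
    split="latest",
)

# One row per stored run; print the metric columns of the first row.
print(results[0])
```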
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_shibing624__chinese-alpaca-plus-7b-hf
|
[
"region:us"
] |
2023-08-17T22:48:53+00:00
|
{"pretty_name": "Evaluation run of shibing624/chinese-alpaca-plus-7b-hf", "dataset_summary": "Dataset automatically created during the evaluation run of model [shibing624/chinese-alpaca-plus-7b-hf](https://huggingface.co/shibing624/chinese-alpaca-plus-7b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shibing624__chinese-alpaca-plus-7b-hf\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T01:27:29.107339](https://huggingface.co/datasets/open-llm-leaderboard/details_shibing624__chinese-alpaca-plus-7b-hf/blob/main/results_2023-10-18T01-27-29.107339.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.231753355704698,\n \"em_stderr\": 0.004321186423348418,\n \"f1\": 0.28607277684563825,\n \"f1_stderr\": 0.004345067745668727,\n \"acc\": 0.35384577180220117,\n \"acc_stderr\": 0.007568088084173026\n },\n \"harness|drop|3\": {\n \"em\": 0.231753355704698,\n \"em_stderr\": 0.004321186423348418,\n \"f1\": 0.28607277684563825,\n \"f1_stderr\": 0.004345067745668727\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \"acc_stderr\": 0.002267537102254516\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7008681925808997,\n \"acc_stderr\": 0.012868639066091536\n }\n}\n```", "repo_url": "https://huggingface.co/shibing624/chinese-alpaca-plus-7b-hf", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T01_27_29.107339", "path": ["**/details_harness|drop|3_2023-10-18T01-27-29.107339.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T01-27-29.107339.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T01_27_29.107339", "path": ["**/details_harness|gsm8k|5_2023-10-18T01-27-29.107339.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T01-27-29.107339.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:03:31.157428.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:03:31.157428.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:03:31.157428.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:03:31.157428.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:03:31.157428.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:03:31.157428.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T01_27_29.107339", "path": ["**/details_harness|winogrande|5_2023-10-18T01-27-29.107339.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T01-27-29.107339.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T17_03_31.157428", "path": ["results_2023-07-19T17:03:31.157428.parquet"]}, {"split": "2023_10_18T01_27_29.107339", "path": ["results_2023-10-18T01-27-29.107339.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T01-27-29.107339.parquet"]}]}]}
|
2023-10-18T00:27:41+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of shibing624/chinese-alpaca-plus-7b-hf
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model shibing624/chinese-alpaca-plus-7b-hf on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-18T01:27:29.107339 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of shibing624/chinese-alpaca-plus-7b-hf",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model shibing624/chinese-alpaca-plus-7b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T01:27:29.107339(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of shibing624/chinese-alpaca-plus-7b-hf",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model shibing624/chinese-alpaca-plus-7b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T01:27:29.107339(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of shibing624/chinese-alpaca-plus-7b-hf## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model shibing624/chinese-alpaca-plus-7b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T01:27:29.107339(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
09c701ca8637049c64784fec347e1d180aebd8b3
|
# Dataset Card for Evaluation run of shibing624/chinese-llama-plus-13b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/shibing624/chinese-llama-plus-13b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [shibing624/chinese-llama-plus-13b-hf](https://huggingface.co/shibing624/chinese-llama-plus-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shibing624__chinese-llama-plus-13b-hf",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T23:56:56.799721](https://huggingface.co/datasets/open-llm-leaderboard/details_shibing624__chinese-llama-plus-13b-hf/blob/main/results_2023-10-15T23-56-56.799721.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.024958053691275166,
"em_stderr": 0.001597558088314438,
"f1": 0.1507854446308717,
"f1_stderr": 0.0025774542628611845,
"acc": 0.3680836753585655,
"acc_stderr": 0.007233108836108408
},
"harness|drop|3": {
"em": 0.024958053691275166,
"em_stderr": 0.001597558088314438,
"f1": 0.1507854446308717,
"f1_stderr": 0.0025774542628611845
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.002001305720948079
},
"harness|winogrande|5": {
"acc": 0.7308602999210734,
"acc_stderr": 0.012464911951268736
}
}
```
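As with the other leaderboard detail repositories, the aggregated numbers above can also be fetched directly with `datasets`. This is a sketch assuming the "results" configuration and "latest" split listed in this repository's configs:

```python
from datasets import load_dataset

# Fetch the aggregated metrics of the latest evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_shibing624__chinese-llama-plus-13b-hf",
    "results",
    split="latest",
)
print(results[0])  # one row per run, with the stored metric columns
```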
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_shibing624__chinese-llama-plus-13b-hf
|
[
"region:us"
] |
2023-08-17T22:49:01+00:00
|
{"pretty_name": "Evaluation run of shibing624/chinese-llama-plus-13b-hf", "dataset_summary": "Dataset automatically created during the evaluation run of model [shibing624/chinese-llama-plus-13b-hf](https://huggingface.co/shibing624/chinese-llama-plus-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shibing624__chinese-llama-plus-13b-hf\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T23:56:56.799721](https://huggingface.co/datasets/open-llm-leaderboard/details_shibing624__chinese-llama-plus-13b-hf/blob/main/results_2023-10-15T23-56-56.799721.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.024958053691275166,\n \"em_stderr\": 0.001597558088314438,\n \"f1\": 0.1507854446308717,\n \"f1_stderr\": 0.0025774542628611845,\n \"acc\": 0.3680836753585655,\n \"acc_stderr\": 0.007233108836108408\n },\n \"harness|drop|3\": {\n \"em\": 0.024958053691275166,\n \"em_stderr\": 0.001597558088314438,\n \"f1\": 0.1507854446308717,\n \"f1_stderr\": 0.0025774542628611845\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \"acc_stderr\": 0.002001305720948079\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7308602999210734,\n \"acc_stderr\": 0.012464911951268736\n }\n}\n```", "repo_url": "https://huggingface.co/shibing624/chinese-llama-plus-13b-hf", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|arc:challenge|25_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T23_56_56.799721", "path": ["**/details_harness|drop|3_2023-10-15T23-56-56.799721.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T23-56-56.799721.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T23_56_56.799721", "path": ["**/details_harness|gsm8k|5_2023-10-15T23-56-56.799721.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T23-56-56.799721.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hellaswag|10_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T15:48:16.269261.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:48:16.269261.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:48:16.269261.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T15:48:16.269261.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T15:48:16.269261.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T15:48:16.269261.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T23_56_56.799721", "path": ["**/details_harness|winogrande|5_2023-10-15T23-56-56.799721.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T23-56-56.799721.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T15_48_16.269261", "path": ["results_2023-07-18T15:48:16.269261.parquet"]}, {"split": "2023_10_15T23_56_56.799721", "path": ["results_2023-10-15T23-56-56.799721.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T23-56-56.799721.parquet"]}]}]}
|
2023-10-15T22:57:09+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of shibing624/chinese-llama-plus-13b-hf
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model shibing624/chinese-llama-plus-13b-hf on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
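A minimal sketch of such a load, using the repository and one of the configuration names recorded in the metadata above (any other configuration listed there can be substituted):

```python
from datasets import load_dataset

# Details of the Winogrande evaluation for this model; the "train" split
# always points to the latest run's results.
data = load_dataset(
    "open-llm-leaderboard/details_shibing624__chinese-llama-plus-13b-hf",
    "harness_winogrande_5",
    split="train",
)
```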
## Latest results
These are the latest results from run 2023-10-15T23:56:56.799721 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of shibing624/chinese-llama-plus-13b-hf",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model shibing624/chinese-llama-plus-13b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T23:56:56.799721(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of shibing624/chinese-llama-plus-13b-hf",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model shibing624/chinese-llama-plus-13b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T23:56:56.799721(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of shibing624/chinese-llama-plus-13b-hf## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model shibing624/chinese-llama-plus-13b-hf on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T23:56:56.799721(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
8e42926d59cafb036de04ef55047849d31d12910
|
# Dataset Card for Evaluation run of upstage/llama-30b-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/upstage/llama-30b-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [upstage/llama-30b-instruct](https://huggingface.co/upstage/llama-30b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_upstage__llama-30b-instruct",
"harness_winogrande_5",
split="train")
```
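
Because each run is also exposed as its own timestamped split, a specific evaluation can be pinned instead of following the moving "train"/"latest" split. A sketch, using the split name recorded for the DROP run in this card's configuration list:

```python
from datasets import load_dataset

# Pin the 2023-09-17 DROP run explicitly rather than relying on "latest".
drop_details = load_dataset(
    "open-llm-leaderboard/details_upstage__llama-30b-instruct",
    "harness_drop_3",
    split="2023_09_17T15_33_08.826830",
)
```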
## Latest results
These are the [latest results from run 2023-09-17T15:33:08.826830](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__llama-30b-instruct/blob/main/results_2023-09-17T15-33-08.826830.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.19924496644295303,
"em_stderr": 0.004090563786479079,
"f1": 0.2739314177852351,
"f1_stderr": 0.004108459298679424,
"acc": 0.46317766024223705,
"acc_stderr": 0.01006349395660694
},
"harness|drop|3": {
"em": 0.19924496644295303,
"em_stderr": 0.004090563786479079,
"f1": 0.2739314177852351,
"f1_stderr": 0.004108459298679424
},
"harness|gsm8k|5": {
"acc": 0.12130401819560273,
"acc_stderr": 0.0089928884972756
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938278
}
}
```
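
The same numbers can also be pulled programmatically through the aggregated "results" configuration rather than read from the JSON above. A sketch, assuming the "results" configuration follows the same "latest" split convention as the other configurations of this dataset:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; inspect the first row.
results = load_dataset(
    "open-llm-leaderboard/details_upstage__llama-30b-instruct",
    "results",
    split="latest",
)
print(results[0])
```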
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_upstage__llama-30b-instruct
|
[
"region:us"
] |
2023-08-17T22:49:10+00:00
|
{"pretty_name": "Evaluation run of upstage/llama-30b-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [upstage/llama-30b-instruct](https://huggingface.co/upstage/llama-30b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_upstage__llama-30b-instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T15:33:08.826830](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__llama-30b-instruct/blob/main/results_2023-09-17T15-33-08.826830.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19924496644295303,\n \"em_stderr\": 0.004090563786479079,\n \"f1\": 0.2739314177852351,\n \"f1_stderr\": 0.004108459298679424,\n \"acc\": 0.46317766024223705,\n \"acc_stderr\": 0.01006349395660694\n },\n \"harness|drop|3\": {\n \"em\": 0.19924496644295303,\n \"em_stderr\": 0.004090563786479079,\n \"f1\": 0.2739314177852351,\n \"f1_stderr\": 0.004108459298679424\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12130401819560273,\n \"acc_stderr\": 0.0089928884972756\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938278\n }\n}\n```", "repo_url": "https://huggingface.co/upstage/llama-30b-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T15_33_08.826830", "path": ["**/details_harness|drop|3_2023-09-17T15-33-08.826830.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T15-33-08.826830.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T15_33_08.826830", "path": ["**/details_harness|gsm8k|5_2023-09-17T15-33-08.826830.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T15-33-08.826830.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:33:00.369415.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:33:00.369415.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:33:00.369415.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:33:00.369415.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:33:00.369415.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T15_33_08.826830", "path": ["**/details_harness|winogrande|5_2023-09-17T15-33-08.826830.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T15-33-08.826830.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_33_00.369415", "path": ["results_2023-07-19T22:33:00.369415.parquet"]}, {"split": "2023_09_17T15_33_08.826830", "path": ["results_2023-09-17T15-33-08.826830.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T15-33-08.826830.parquet"]}]}]}
|
2023-09-17T14:33:20+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of upstage/llama-30b-instruct
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model upstage/llama-30b-instruct on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
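A minimal sketch of that call (assuming the repository follows the leaderboard's usual details_<org>__<model> naming pattern and using the harness_winogrande_5 configuration listed in this card's configs):

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the details_<org>__<model> naming convention.
data = load_dataset("open-llm-leaderboard/details_upstage__llama-30b-instruct",
	"harness_winogrande_5",
	split="train")  # "train" always points to the latest results
```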
## Latest results
These are the latest results from run 2023-09-17T15:33:08.826830 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of upstage/llama-30b-instruct",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model upstage/llama-30b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T15:33:08.826830(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of upstage/llama-30b-instruct",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model upstage/llama-30b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T15:33:08.826830(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of upstage/llama-30b-instruct## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model upstage/llama-30b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T15:33:08.826830(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
d40bff086051baa81088f0721ca3f366fb352e54
|
# Dataset Card for Evaluation run of upstage/llama-30b-instruct-2048
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/upstage/llama-30b-instruct-2048
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [upstage/llama-30b-instruct-2048](https://huggingface.co/upstage/llama-30b-instruct-2048) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_upstage__llama-30b-instruct-2048",
"harness_winogrande_5",
split="train")
```
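The aggregated scores mentioned above live in the separate "results" configuration; a minimal sketch for pulling its most recent snapshot (assuming the "latest" split listed in this card's configs) looks like:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics; "latest" maps to the newest run.
results = load_dataset("open-llm-leaderboard/details_upstage__llama-30b-instruct-2048",
	"results",
	split="latest")
print(results)  # inspect what was loaded
```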
## Latest results
These are the [latest results from run 2023-10-19T00:52:48.467311](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__llama-30b-instruct-2048/blob/main/results_2023-10-19T00-52-48.467311.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.35539010067114096,
"em_stderr": 0.004901633098104223,
"f1": 0.44463611577181344,
"f1_stderr": 0.004655171488287754,
"acc": 0.48687269361101454,
"acc_stderr": 0.010937111570073342
},
"harness|drop|3": {
"em": 0.35539010067114096,
"em_stderr": 0.004901633098104223,
"f1": 0.44463611577181344,
"f1_stderr": 0.004655171488287754
},
"harness|gsm8k|5": {
"acc": 0.17816527672479152,
"acc_stderr": 0.01054013252754947
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597212
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_upstage__llama-30b-instruct-2048
|
[
"region:us"
] |
2023-08-17T22:49:19+00:00
|
{"pretty_name": "Evaluation run of upstage/llama-30b-instruct-2048", "dataset_summary": "Dataset automatically created during the evaluation run of model [upstage/llama-30b-instruct-2048](https://huggingface.co/upstage/llama-30b-instruct-2048) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_upstage__llama-30b-instruct-2048\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T00:52:48.467311](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__llama-30b-instruct-2048/blob/main/results_2023-10-19T00-52-48.467311.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.35539010067114096,\n \"em_stderr\": 0.004901633098104223,\n \"f1\": 0.44463611577181344,\n \"f1_stderr\": 0.004655171488287754,\n \"acc\": 0.48687269361101454,\n \"acc_stderr\": 0.010937111570073342\n },\n \"harness|drop|3\": {\n \"em\": 0.35539010067114096,\n \"em_stderr\": 0.004901633098104223,\n \"f1\": 0.44463611577181344,\n \"f1_stderr\": 0.004655171488287754\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17816527672479152,\n \"acc_stderr\": 0.01054013252754947\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597212\n }\n}\n```", "repo_url": "https://huggingface.co/upstage/llama-30b-instruct-2048", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|arc:challenge|25_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T00_52_48.467311", "path": ["**/details_harness|drop|3_2023-10-19T00-52-48.467311.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T00-52-48.467311.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T00_52_48.467311", "path": ["**/details_harness|gsm8k|5_2023-10-19T00-52-48.467311.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-19T00-52-48.467311.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hellaswag|10_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:29:43.161348.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:29:43.161348.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T12:29:43.161348.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T12:29:43.161348.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T12:29:43.161348.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T00_52_48.467311", "path": ["**/details_harness|winogrande|5_2023-10-19T00-52-48.467311.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T00-52-48.467311.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T12_29_43.161348", "path": ["results_2023-07-19T12:29:43.161348.parquet"]}, {"split": "2023_10_19T00_52_48.467311", "path": ["results_2023-10-19T00-52-48.467311.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T00-52-48.467311.parquet"]}]}]}
|
2023-10-18T23:53:00+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of upstage/llama-30b-instruct-2048
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model upstage/llama-30b-instruct-2048 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
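A minimal sketch of that call, assuming the repository id follows the usual open-llm-leaderboard naming (i.e. `open-llm-leaderboard/details_upstage__llama-30b-instruct-2048`) and picking one of the task configurations listed in this card's metadata:

```python
from datasets import load_dataset

# Assumed repository id, following the open-llm-leaderboard naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_upstage__llama-30b-instruct-2048",
    "harness_winogrande_5",   # one of the 64 task configurations
    split="train",            # "train" always points to the latest results
)
```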
## Latest results
These are the latest results from run 2023-10-19T00:52:48.467311 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
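For the aggregated numbers themselves, a hedged sketch that loads the "results" configuration at its "latest" split (the config and split names are taken from this card's own file listing; the repository id is assumed as above):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_upstage__llama-30b-instruct-2048",  # assumed repo id
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated metrics of the latest run
```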
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of upstage/llama-30b-instruct-2048",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model upstage/llama-30b-instruct-2048 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T00:52:48.467311(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of upstage/llama-30b-instruct-2048",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model upstage/llama-30b-instruct-2048 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T00:52:48.467311(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of upstage/llama-30b-instruct-2048## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model upstage/llama-30b-instruct-2048 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-19T00:52:48.467311(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
f255f85ecba8f546856e9bb450a86a379e8c831d
|
# Dataset Card for Evaluation run of upstage/Llama-2-70b-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/upstage/Llama-2-70b-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [upstage/Llama-2-70b-instruct](https://huggingface.co/upstage/Llama-2-70b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_upstage__Llama-2-70b-instruct",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T12:48:24.237609](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__Llama-2-70b-instruct/blob/main/results_2023-10-17T12-48-24.237609.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.49989513422818793,
"em_stderr": 0.005120467878578845,
"f1": 0.5841736577181234,
"f1_stderr": 0.004671177225967014,
"acc": 0.5754715400500128,
"acc_stderr": 0.011730426388075654
},
"harness|drop|3": {
"em": 0.49989513422818793,
"em_stderr": 0.005120467878578845,
"f1": 0.5841736577181234,
"f1_stderr": 0.004671177225967014
},
"harness|gsm8k|5": {
"acc": 0.32221379833206976,
"acc_stderr": 0.01287243548118878
},
"harness|winogrande|5": {
"acc": 0.8287292817679558,
"acc_stderr": 0.010588417294962526
}
}
```
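The same numbers can also be retrieved programmatically from the "results" configuration. A minimal sketch (the exact column layout of the results parquet is not documented in this card, so the snippet only inspects what is actually there):

```python
from datasets import load_dataset

# Aggregated metrics for the latest run of upstage/Llama-2-70b-instruct.
results = load_dataset(
    "open-llm-leaderboard/details_upstage__Llama-2-70b-instruct",
    "results",
    split="latest",
)
print(results.column_names)  # see which fields the aggregated results expose
print(results[0])            # e.g. drop/gsm8k/winogrande metrics for the latest run
```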
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_upstage__Llama-2-70b-instruct
|
[
"region:us"
] |
2023-08-17T22:49:28+00:00
|
{"pretty_name": "Evaluation run of upstage/Llama-2-70b-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [upstage/Llama-2-70b-instruct](https://huggingface.co/upstage/Llama-2-70b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_upstage__Llama-2-70b-instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T12:48:24.237609](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__Llama-2-70b-instruct/blob/main/results_2023-10-17T12-48-24.237609.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.49989513422818793,\n \"em_stderr\": 0.005120467878578845,\n \"f1\": 0.5841736577181234,\n \"f1_stderr\": 0.004671177225967014,\n \"acc\": 0.5754715400500128,\n \"acc_stderr\": 0.011730426388075654\n },\n \"harness|drop|3\": {\n \"em\": 0.49989513422818793,\n \"em_stderr\": 0.005120467878578845,\n \"f1\": 0.5841736577181234,\n \"f1_stderr\": 0.004671177225967014\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.32221379833206976,\n \"acc_stderr\": 0.01287243548118878\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8287292817679558,\n \"acc_stderr\": 0.010588417294962526\n }\n}\n```", "repo_url": "https://huggingface.co/upstage/Llama-2-70b-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|arc:challenge|25_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T12_48_24.237609", "path": ["**/details_harness|drop|3_2023-10-17T12-48-24.237609.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T12-48-24.237609.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T12_48_24.237609", "path": ["**/details_harness|gsm8k|5_2023-10-17T12-48-24.237609.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T12-48-24.237609.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hellaswag|10_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:38:35.808290.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:38:35.808290.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T16:38:35.808290.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T16:38:35.808290.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T16:38:35.808290.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T12_48_24.237609", "path": ["**/details_harness|winogrande|5_2023-10-17T12-48-24.237609.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T12-48-24.237609.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T16_38_35.808290", "path": ["results_2023-07-31T16:38:35.808290.parquet"]}, {"split": "2023_10_17T12_48_24.237609", "path": ["results_2023-10-17T12-48-24.237609.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T12-48-24.237609.parquet"]}]}]}
|
2023-10-17T11:48:37+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of upstage/Llama-2-70b-instruct
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model upstage/Llama-2-70b-instruct on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
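A minimal sketch of that call, mirroring the fenced example kept in the original card for this repository:

```python
from datasets import load_dataset

# Details for one task configuration of upstage/Llama-2-70b-instruct.
data = load_dataset(
    "open-llm-leaderboard/details_upstage__Llama-2-70b-instruct",
    "harness_winogrande_5",   # one of the 64 task configurations
    split="train",            # "train" always points to the latest results
)
```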
## Latest results
These are the latest results from run 2023-10-17T12:48:24.237609 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of upstage/Llama-2-70b-instruct",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model upstage/Llama-2-70b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T12:48:24.237609(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of upstage/Llama-2-70b-instruct",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model upstage/Llama-2-70b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T12:48:24.237609(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of upstage/Llama-2-70b-instruct## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model upstage/Llama-2-70b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T12:48:24.237609(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
e294ae331ecd5b9c9ab0b6d836bd955955dab02a
|
# Dataset Card for Evaluation run of upstage/llama-65b-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/upstage/llama-65b-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [upstage/llama-65b-instruct](https://huggingface.co/upstage/llama-65b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_upstage__llama-65b-instruct",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T19:27:31.642045](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__llama-65b-instruct/blob/main/results_2023-10-24T19-27-31.642045.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.454383389261745,
"em_stderr": 0.005099113352549085,
"f1": 0.5468970218120836,
"f1_stderr": 0.004699295426287538,
"acc": 0.5364480517576576,
"acc_stderr": 0.011564851426457474
},
"harness|drop|3": {
"em": 0.454383389261745,
"em_stderr": 0.005099113352549085,
"f1": 0.5468970218120836,
"f1_stderr": 0.004699295426287538
},
"harness|gsm8k|5": {
"acc": 0.2623199393479909,
"acc_stderr": 0.012116912419925702
},
"harness|winogrande|5": {
"acc": 0.8105761641673244,
"acc_stderr": 0.011012790432989247
}
}
```
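The aggregated numbers shown above can also be pulled programmatically. The snippet below is a minimal sketch that loads the "results" configuration of this dataset and reads its "latest" split (both are described above); the exact column layout of the results rows is not documented in this card, so it is treated as unknown and printed before use.

```python
from datasets import load_dataset

# Aggregated metrics for this evaluation run; the "latest" split always
# points to the most recent run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_upstage__llama-65b-instruct",
    "results",
    split="latest",
)

# The schema of the results rows is an assumption here, so inspect it first
# before relying on specific column names.
print(results.column_names)
print(results[0])
```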
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_upstage__llama-65b-instruct
|
[
"region:us"
] |
2023-08-17T22:49:37+00:00
|
{"pretty_name": "Evaluation run of upstage/llama-65b-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [upstage/llama-65b-instruct](https://huggingface.co/upstage/llama-65b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_upstage__llama-65b-instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-24T19:27:31.642045](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__llama-65b-instruct/blob/main/results_2023-10-24T19-27-31.642045.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.454383389261745,\n \"em_stderr\": 0.005099113352549085,\n \"f1\": 0.5468970218120836,\n \"f1_stderr\": 0.004699295426287538,\n \"acc\": 0.5364480517576576,\n \"acc_stderr\": 0.011564851426457474\n },\n \"harness|drop|3\": {\n \"em\": 0.454383389261745,\n \"em_stderr\": 0.005099113352549085,\n \"f1\": 0.5468970218120836,\n \"f1_stderr\": 0.004699295426287538\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2623199393479909,\n \"acc_stderr\": 0.012116912419925702\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989247\n }\n}\n```", "repo_url": "https://huggingface.co/upstage/llama-65b-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|arc:challenge|25_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T01_44_05.835561", "path": ["**/details_harness|drop|3_2023-10-17T01-44-05.835561.parquet"]}, {"split": "2023_10_24T19_27_31.642045", "path": ["**/details_harness|drop|3_2023-10-24T19-27-31.642045.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-24T19-27-31.642045.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T01_44_05.835561", "path": ["**/details_harness|gsm8k|5_2023-10-17T01-44-05.835561.parquet"]}, {"split": "2023_10_24T19_27_31.642045", "path": ["**/details_harness|gsm8k|5_2023-10-24T19-27-31.642045.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-24T19-27-31.642045.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": 
["**/details_harness|hellaswag|10_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:32:35.958499.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:32:35.958499.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:32:35.958499.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-07-31T16:32:35.958499.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:32:35.958499.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T16:32:35.958499.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T16:32:35.958499.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T01_44_05.835561", "path": ["**/details_harness|winogrande|5_2023-10-17T01-44-05.835561.parquet"]}, {"split": "2023_10_24T19_27_31.642045", "path": ["**/details_harness|winogrande|5_2023-10-24T19-27-31.642045.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-24T19-27-31.642045.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T16_32_35.958499", "path": ["results_2023-07-31T16:32:35.958499.parquet"]}, {"split": "2023_10_17T01_44_05.835561", "path": ["results_2023-10-17T01-44-05.835561.parquet"]}, {"split": "2023_10_24T19_27_31.642045", "path": ["results_2023-10-24T19-27-31.642045.parquet"]}, {"split": "latest", "path": ["results_2023-10-24T19-27-31.642045.parquet"]}]}]}
|
2023-10-24T18:27:41+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of upstage/llama-65b-instruct
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model upstage/llama-65b-instruct on the Open LLM Leaderboard.
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-24T19:27:31.642045(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of upstage/llama-65b-instruct",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model upstage/llama-65b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T19:27:31.642045(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of upstage/llama-65b-instruct",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model upstage/llama-65b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T19:27:31.642045(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of upstage/llama-65b-instruct## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model upstage/llama-65b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-24T19:27:31.642045(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
2da6608913fe626322b233db1e4ca96537fa28b6
|
# Dataset Card for Evaluation run of upstage/Llama-2-70b-instruct-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/upstage/Llama-2-70b-instruct-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [upstage/Llama-2-70b-instruct-v2](https://huggingface.co/upstage/Llama-2-70b-instruct-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_upstage__Llama-2-70b-instruct-v2",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-03T01:46:57.047903](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__Llama-2-70b-instruct-v2/blob/main/results_2023-08-03T01%3A46%3A57.047903.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7050740464217434,
"acc_stderr": 0.03085018588043536,
"acc_norm": 0.7087855823993987,
"acc_norm_stderr": 0.03081992944181276,
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6224972679005382,
"mc2_stderr": 0.014880875055625352
},
"harness|arc:challenge|25": {
"acc": 0.6732081911262798,
"acc_stderr": 0.013706665975587333,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393441
},
"harness|hellaswag|10": {
"acc": 0.6974706233817964,
"acc_stderr": 0.00458414401465495,
"acc_norm": 0.8789085839474209,
"acc_norm_stderr": 0.0032556675321152857
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.029674167520101453,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.029674167520101453
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7320754716981132,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.7320754716981132,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.030635578972093274,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.030635578972093274
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7063829787234043,
"acc_stderr": 0.029771642712491227,
"acc_norm": 0.7063829787234043,
"acc_norm_stderr": 0.029771642712491227
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.025699352832131792,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.025699352832131792
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.02233170761182307,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.02233170761182307
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5615763546798029,
"acc_stderr": 0.03491207857486519,
"acc_norm": 0.5615763546798029,
"acc_norm_stderr": 0.03491207857486519
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.02845038880528436,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.02845038880528436
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880242,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880242
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7102564102564103,
"acc_stderr": 0.023000628243687968,
"acc_norm": 0.7102564102564103,
"acc_norm_stderr": 0.023000628243687968
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02755361446786381,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02755361446786381
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.04075224992216979,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.04075224992216979
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9027522935779817,
"acc_stderr": 0.012703533408540366,
"acc_norm": 0.9027522935779817,
"acc_norm_stderr": 0.012703533408540366
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.01831885585008968,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.01831885585008968
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.01999556072375854,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.01999556072375854
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002157,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002157
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8684546615581098,
"acc_stderr": 0.01208670521425043,
"acc_norm": 0.8684546615581098,
"acc_norm_stderr": 0.01208670521425043
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7803468208092486,
"acc_stderr": 0.022289638852617893,
"acc_norm": 0.7803468208092486,
"acc_norm_stderr": 0.022289638852617893
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6044692737430167,
"acc_stderr": 0.01635341541007577,
"acc_norm": 0.6044692737430167,
"acc_norm_stderr": 0.01635341541007577
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7781350482315113,
"acc_stderr": 0.02359885829286305,
"acc_norm": 0.7781350482315113,
"acc_norm_stderr": 0.02359885829286305
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.020736358408060006,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.020736358408060006
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.029494827600144366,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.029494827600144366
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5521512385919165,
"acc_stderr": 0.012700582404768235,
"acc_norm": 0.5521512385919165,
"acc_norm_stderr": 0.012700582404768235
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.02667925227010314,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.02667925227010314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.01716058723504635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.01716058723504635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8204081632653061,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.8204081632653061,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015575,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015575
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44430844553243576,
"mc1_stderr": 0.017394586250743173,
"mc2": 0.6224972679005382,
"mc2_stderr": 0.014880875055625352
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_upstage__Llama-2-70b-instruct-v2
|
[
"region:us"
] |
2023-08-17T22:49:45+00:00
|
{"pretty_name": "Evaluation run of upstage/Llama-2-70b-instruct-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [upstage/Llama-2-70b-instruct-v2](https://huggingface.co/upstage/Llama-2-70b-instruct-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_upstage__Llama-2-70b-instruct-v2\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-03T01:46:57.047903](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__Llama-2-70b-instruct-v2/blob/main/results_2023-08-03T01%3A46%3A57.047903.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7050740464217434,\n \"acc_stderr\": 0.03085018588043536,\n \"acc_norm\": 0.7087855823993987,\n \"acc_norm_stderr\": 0.03081992944181276,\n \"mc1\": 0.44430844553243576,\n \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6224972679005382,\n \"mc2_stderr\": 0.014880875055625352\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6732081911262798,\n \"acc_stderr\": 0.013706665975587333,\n \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393441\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6974706233817964,\n \"acc_stderr\": 0.00458414401465495,\n \"acc_norm\": 0.8789085839474209,\n \"acc_norm_stderr\": 0.0032556675321152857\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.029674167520101453,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.029674167520101453\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n \"acc_stderr\": 0.030635578972093274,\n \"acc_norm\": 0.8402777777777778,\n \"acc_norm_stderr\": 0.030635578972093274\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7063829787234043,\n \"acc_stderr\": 0.029771642712491227,\n \"acc_norm\": 0.7063829787234043,\n \"acc_norm_stderr\": 0.029771642712491227\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.025699352832131792,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.025699352832131792\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.02233170761182307,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.02233170761182307\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5615763546798029,\n \"acc_stderr\": 0.03491207857486519,\n \"acc_norm\": 0.5615763546798029,\n \"acc_norm_stderr\": 0.03491207857486519\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528436,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528436\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880242,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880242\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687968,\n \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687968\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02755361446786381,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02755361446786381\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.04075224992216979,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.04075224992216979\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9027522935779817,\n \"acc_stderr\": 0.012703533408540366,\n \"acc_norm\": 0.9027522935779817,\n \"acc_norm_stderr\": 0.012703533408540366\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8945147679324894,\n \"acc_stderr\": 0.01999556072375854,\n \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.01999556072375854\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.859504132231405,\n \"acc_stderr\": 0.03172233426002157,\n \"acc_norm\": 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002157\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8684546615581098,\n \"acc_stderr\": 0.01208670521425043,\n \"acc_norm\": 0.8684546615581098,\n \"acc_norm_stderr\": 0.01208670521425043\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.022289638852617893,\n \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.022289638852617893\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6044692737430167,\n \"acc_stderr\": 0.01635341541007577,\n \"acc_norm\": 0.6044692737430167,\n \"acc_norm_stderr\": 0.01635341541007577\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7781350482315113,\n \"acc_stderr\": 0.02359885829286305,\n \"acc_norm\": 0.7781350482315113,\n \"acc_norm_stderr\": 0.02359885829286305\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.020736358408060006,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.020736358408060006\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.029494827600144366,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.029494827600144366\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5521512385919165,\n \"acc_stderr\": 0.012700582404768235,\n \"acc_norm\": 0.5521512385919165,\n \"acc_norm_stderr\": 0.012700582404768235\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.02667925227010314,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.02667925227010314\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.01716058723504635,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.01716058723504635\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8204081632653061,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.8204081632653061,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015575,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015575\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44430844553243576,\n \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.6224972679005382,\n \"mc2_stderr\": 0.014880875055625352\n }\n}\n```", "repo_url": "https://huggingface.co/upstage/Llama-2-70b-instruct-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email 
protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|arc:challenge|25_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hellaswag|10_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T01:46:57.047903.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T01:46:57.047903.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T01:46:57.047903.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T01:46:57.047903.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T01:46:57.047903.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_03T01_46_57.047903", "path": ["results_2023-08-03T01:46:57.047903.parquet"]}, {"split": "latest", "path": ["results_2023-08-03T01:46:57.047903.parquet"]}]}]}
|
2023-08-27T11:24:18+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of upstage/Llama-2-70b-instruct-v2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model upstage/Llama-2-70b-instruct-v2 on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
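A minimal example, using the TruthfulQA configuration of this dataset (any of the other per-task configurations can be substituted):

```python
from datasets import load_dataset

# Per-example details for one evaluated task; the "train" split points
# to the latest run, as described above.
data = load_dataset("open-llm-leaderboard/details_upstage__Llama-2-70b-instruct-v2",
	"harness_truthfulqa_mc_0",
	split="train")
```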
## Latest results
These are the latest results from run 2023-08-03T01:46:57.047903 (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
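An excerpt of the aggregate scores from that run (the full per-task breakdown is kept in the repository's results files):

```python
{
    "all": {
        "acc": 0.7050740464217434,
        "acc_stderr": 0.03085018588043536,
        "acc_norm": 0.7087855823993987,
        "acc_norm_stderr": 0.03081992944181276,
        "mc1": 0.44430844553243576,
        "mc1_stderr": 0.017394586250743173,
        "mc2": 0.6224972679005382,
        "mc2_stderr": 0.014880875055625352
    }
}
```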
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of upstage/Llama-2-70b-instruct-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model upstage/Llama-2-70b-instruct-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-03T01:46:57.047903 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of upstage/Llama-2-70b-instruct-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model upstage/Llama-2-70b-instruct-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-03T01:46:57.047903 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of upstage/Llama-2-70b-instruct-v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model upstage/Llama-2-70b-instruct-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-03T01:46:57.047903 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
122fbf1ce18dadfb30e596c461c4b2b3dd0ab859
|
# Dataset Card for Evaluation run of VMware/open-llama-0.7T-7B-open-instruct-v1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/VMware/open-llama-0.7T-7B-open-instruct-v1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [VMware/open-llama-0.7T-7B-open-instruct-v1.1](https://huggingface.co/VMware/open-llama-0.7T-7B-open-instruct-v1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_VMware__open-llama-0.7T-7B-open-instruct-v1.1",
"harness_winogrande_5",
split="train")
```
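The per-task configurations return per-example details. To get the aggregated metrics instead, the "results" configuration described above can be loaded the same way; a sketch, where the split name is assumed to follow the "latest"/timestamp convention used by this dataset's other configurations:

```python
from datasets import load_dataset

# Aggregated metrics across all evaluated tasks for this model.
# "latest" is assumed to point at the most recent run, per the
# split-naming convention described in the summary above.
results = load_dataset("open-llm-leaderboard/details_VMware__open-llama-0.7T-7B-open-instruct-v1.1",
	"results",
	split="latest")
```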
## Latest results
These are the [latest results from run 2023-09-22T16:28:57.992845](https://huggingface.co/datasets/open-llm-leaderboard/details_VMware__open-llama-0.7T-7B-open-instruct-v1.1/blob/main/results_2023-09-22T16-28-57.992845.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.23406040268456377,
"em_stderr": 0.004336115943633415,
"f1": 0.28612730704698025,
"f1_stderr": 0.004340090005641948,
"acc": 0.3309415003712961,
"acc_stderr": 0.007877939232005797
},
"harness|drop|3": {
"em": 0.23406040268456377,
"em_stderr": 0.004336115943633415,
"f1": 0.28612730704698025,
"f1_stderr": 0.004340090005641948
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.002389281512077218
},
"harness|winogrande|5": {
"acc": 0.654301499605367,
"acc_stderr": 0.013366596951934375
}
}
```
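If you want the aggregated scores rather than the per-example details, the "results" configuration mentioned above can be loaded the same way. The snippet below is only a sketch based on the config and split names listed in this card's configuration table; the exact columns of the loaded table depend on the run.

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split points to
# the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_VMware__open-llama-0.7T-7B-open-instruct-v1.1",
    "results",
    split="latest",
)
print(results)
```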
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_VMware__open-llama-0.7T-7B-open-instruct-v1.1
|
[
"region:us"
] |
2023-08-17T22:49:54+00:00
|
{"pretty_name": "Evaluation run of VMware/open-llama-0.7T-7B-open-instruct-v1.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [VMware/open-llama-0.7T-7B-open-instruct-v1.1](https://huggingface.co/VMware/open-llama-0.7T-7B-open-instruct-v1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_VMware__open-llama-0.7T-7B-open-instruct-v1.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T16:28:57.992845](https://huggingface.co/datasets/open-llm-leaderboard/details_VMware__open-llama-0.7T-7B-open-instruct-v1.1/blob/main/results_2023-09-22T16-28-57.992845.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.23406040268456377,\n \"em_stderr\": 0.004336115943633415,\n \"f1\": 0.28612730704698025,\n \"f1_stderr\": 0.004340090005641948,\n \"acc\": 0.3309415003712961,\n \"acc_stderr\": 0.007877939232005797\n },\n \"harness|drop|3\": {\n \"em\": 0.23406040268456377,\n \"em_stderr\": 0.004336115943633415,\n \"f1\": 0.28612730704698025,\n \"f1_stderr\": 0.004340090005641948\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n \"acc_stderr\": 0.002389281512077218\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.654301499605367,\n \"acc_stderr\": 0.013366596951934375\n }\n}\n```", "repo_url": "https://huggingface.co/VMware/open-llama-0.7T-7B-open-instruct-v1.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T16_28_57.992845", "path": ["**/details_harness|drop|3_2023-09-22T16-28-57.992845.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T16-28-57.992845.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T16_28_57.992845", "path": ["**/details_harness|gsm8k|5_2023-09-22T16-28-57.992845.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T16-28-57.992845.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:57:28.493539.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:57:28.493539.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:57:28.493539.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:57:28.493539.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:57:28.493539.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:57:28.493539.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T16_28_57.992845", "path": ["**/details_harness|winogrande|5_2023-09-22T16-28-57.992845.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T16-28-57.992845.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_57_28.493539", "path": ["results_2023-07-19T16:57:28.493539.parquet"]}, {"split": "2023_09_22T16_28_57.992845", "path": ["results_2023-09-22T16-28-57.992845.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T16-28-57.992845.parquet"]}]}]}
|
2023-09-22T15:29:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of VMware/open-llama-0.7T-7B-open-instruct-v1.1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model VMware/open-llama-0.7T-7B-open-instruct-v1.1 on the Open LLM Leaderboard.
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-09-22T16:28:57.992845(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of VMware/open-llama-0.7T-7B-open-instruct-v1.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model VMware/open-llama-0.7T-7B-open-instruct-v1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T16:28:57.992845(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of VMware/open-llama-0.7T-7B-open-instruct-v1.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model VMware/open-llama-0.7T-7B-open-instruct-v1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T16:28:57.992845(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
29,
31,
177,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of VMware/open-llama-0.7T-7B-open-instruct-v1.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model VMware/open-llama-0.7T-7B-open-instruct-v1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T16:28:57.992845(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
004e465a2b680834bf5ba6e3b2cc6e4c598fa16b
|
# Dataset Card for Evaluation run of augtoma/qCammel-13
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/augtoma/qCammel-13
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [augtoma/qCammel-13](https://huggingface.co/augtoma/qCammel-13) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_augtoma__qCammel-13",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T23:25:48.573566](https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel-13/blob/main/results_2023-09-17T23-25-48.573566.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004614093959731544,
"em_stderr": 0.0006940305886353496,
"f1": 0.06571308724832206,
"f1_stderr": 0.0014345437329154143,
"acc": 0.4376820951511304,
"acc_stderr": 0.01035987939936818
},
"harness|drop|3": {
"em": 0.004614093959731544,
"em_stderr": 0.0006940305886353496,
"f1": 0.06571308724832206,
"f1_stderr": 0.0014345437329154143
},
"harness|gsm8k|5": {
"acc": 0.11372251705837756,
"acc_stderr": 0.008744810131034047
},
"harness|winogrande|5": {
"acc": 0.7616416732438832,
"acc_stderr": 0.011974948667702313
}
}
```
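To dig into the per-example predictions behind one of these aggregate numbers, you can load the corresponding task configuration and turn it into a DataFrame. This is only a sketch: the "harness_gsm8k_5" config name comes from this card's configuration list, and the column layout of the details files may differ between runs.

```python
from datasets import load_dataset

# Per-example details for the 5-shot GSM8K evaluation of augtoma/qCammel-13.
details = load_dataset(
    "open-llm-leaderboard/details_augtoma__qCammel-13",
    "harness_gsm8k_5",
    split="latest",
)
df = details.to_pandas()  # requires pandas to be installed
print(df.head())
```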
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_augtoma__qCammel-13
|
[
"region:us"
] |
2023-08-17T22:50:02+00:00
|
{"pretty_name": "Evaluation run of augtoma/qCammel-13", "dataset_summary": "Dataset automatically created during the evaluation run of model [augtoma/qCammel-13](https://huggingface.co/augtoma/qCammel-13) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_augtoma__qCammel-13\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T23:25:48.573566](https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel-13/blob/main/results_2023-09-17T23-25-48.573566.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004614093959731544,\n \"em_stderr\": 0.0006940305886353496,\n \"f1\": 0.06571308724832206,\n \"f1_stderr\": 0.0014345437329154143,\n \"acc\": 0.4376820951511304,\n \"acc_stderr\": 0.01035987939936818\n },\n \"harness|drop|3\": {\n \"em\": 0.004614093959731544,\n \"em_stderr\": 0.0006940305886353496,\n \"f1\": 0.06571308724832206,\n \"f1_stderr\": 0.0014345437329154143\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11372251705837756,\n \"acc_stderr\": 0.008744810131034047\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702313\n }\n}\n```", "repo_url": "https://huggingface.co/augtoma/qCammel-13", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|arc:challenge|25_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T23_25_48.573566", "path": ["**/details_harness|drop|3_2023-09-17T23-25-48.573566.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T23-25-48.573566.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T23_25_48.573566", "path": ["**/details_harness|gsm8k|5_2023-09-17T23-25-48.573566.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T23-25-48.573566.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hellaswag|10_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T11:13:38.716664.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T11:13:38.716664.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T11:13:38.716664.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T11:13:38.716664.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T11:13:38.716664.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T11:13:38.716664.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T23_25_48.573566", "path": ["**/details_harness|winogrande|5_2023-09-17T23-25-48.573566.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T23-25-48.573566.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T11_13_38.716664", "path": ["results_2023-07-25T11:13:38.716664.parquet"]}, {"split": "2023_09_17T23_25_48.573566", "path": ["results_2023-09-17T23-25-48.573566.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T23-25-48.573566.parquet"]}]}]}
|
2023-09-17T22:26:00+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of augtoma/qCammel-13
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model augtoma/qCammel-13 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
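A minimal sketch is shown below; the repository id is assumed from the standard open-llm-leaderboard naming convention, and the configuration name is one of those listed in this card's metadata:
```python
from datasets import load_dataset

# Assumed repository id (open-llm-leaderboard naming convention) and one of
# the configurations listed in this card's metadata; the "train" split
# always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_augtoma__qCammel-13",
	"harness_winogrande_5",
	split="train")
```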
## Latest results
These are the latest results from run 2023-09-17T23:25:48.573566 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of augtoma/qCammel-13",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model augtoma/qCammel-13 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T23:25:48.573566(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of augtoma/qCammel-13",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model augtoma/qCammel-13 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T23:25:48.573566(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
16,
31,
164,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of augtoma/qCammel-13## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model augtoma/qCammel-13 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T23:25:48.573566(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
3701e2ecd80c751c62c0421c62aa6f5e5a5f19f4
|
# Dataset Card for Evaluation run of augtoma/qCammel-70-x
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/augtoma/qCammel-70-x
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [augtoma/qCammel-70-x](https://huggingface.co/augtoma/qCammel-70-x) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_augtoma__qCammel-70-x",
"harness_winogrande_5",
split="train")
```
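
For the aggregated metrics, a similar call can be used (a sketch based on the description above: the "results" configuration stores the aggregated results and the "latest" split points to the most recent run):

```python
from datasets import load_dataset

# "results" stores the aggregated results of the runs; the "latest" split
# points to the most recent evaluation, as described above.
results = load_dataset("open-llm-leaderboard/details_augtoma__qCammel-70-x",
	"results",
	split="latest")
```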
## Latest results
These are the [latest results from run 2023-10-18T15:29:16.459278](https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel-70-x/blob/main/results_2023-10-18T15-29-16.459278.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.033766778523489936,
"em_stderr": 0.001849802869119515,
"f1": 0.10340918624161041,
"f1_stderr": 0.0022106009828094797,
"acc": 0.5700654570173166,
"acc_stderr": 0.011407494958111332
},
"harness|drop|3": {
"em": 0.033766778523489936,
"em_stderr": 0.001849802869119515,
"f1": 0.10340918624161041,
"f1_stderr": 0.0022106009828094797
},
"harness|gsm8k|5": {
"acc": 0.2971948445792267,
"acc_stderr": 0.012588685966624186
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598479
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_augtoma__qCammel-70-x
|
[
"region:us"
] |
2023-08-17T22:50:11+00:00
|
{"pretty_name": "Evaluation run of augtoma/qCammel-70-x", "dataset_summary": "Dataset automatically created during the evaluation run of model [augtoma/qCammel-70-x](https://huggingface.co/augtoma/qCammel-70-x) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_augtoma__qCammel-70-x\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T15:29:16.459278](https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel-70-x/blob/main/results_2023-10-18T15-29-16.459278.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.033766778523489936,\n \"em_stderr\": 0.001849802869119515,\n \"f1\": 0.10340918624161041,\n \"f1_stderr\": 0.0022106009828094797,\n \"acc\": 0.5700654570173166,\n \"acc_stderr\": 0.011407494958111332\n },\n \"harness|drop|3\": {\n \"em\": 0.033766778523489936,\n \"em_stderr\": 0.001849802869119515,\n \"f1\": 0.10340918624161041,\n \"f1_stderr\": 0.0022106009828094797\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2971948445792267,\n \"acc_stderr\": 0.012588685966624186\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598479\n }\n}\n```", "repo_url": "https://huggingface.co/augtoma/qCammel-70-x", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|arc:challenge|25_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T15_29_16.459278", "path": ["**/details_harness|drop|3_2023-10-18T15-29-16.459278.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T15-29-16.459278.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T15_29_16.459278", "path": ["**/details_harness|gsm8k|5_2023-10-18T15-29-16.459278.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T15-29-16.459278.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hellaswag|10_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T21:18:05.927693.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T21:18:05.927693.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T21:18:05.927693.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T21:18:05.927693.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T21:18:05.927693.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T15_29_16.459278", "path": ["**/details_harness|winogrande|5_2023-10-18T15-29-16.459278.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T15-29-16.459278.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T21_18_05.927693", "path": ["results_2023-07-31T21:18:05.927693.parquet"]}, {"split": "2023_10_18T15_29_16.459278", "path": ["results_2023-10-18T15-29-16.459278.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T15-29-16.459278.parquet"]}]}]}
|
2023-10-18T14:29:29+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of augtoma/qCammel-70-x
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model augtoma/qCammel-70-x on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-18T15:29:16.459278 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of augtoma/qCammel-70-x",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model augtoma/qCammel-70-x on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T15:29:16.459278(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of augtoma/qCammel-70-x",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model augtoma/qCammel-70-x on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T15:29:16.459278(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
18,
31,
166,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of augtoma/qCammel-70-x## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model augtoma/qCammel-70-x on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T15:29:16.459278(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
926a4398f0628234f8a8c0b809ef06a074f8b5d1
|
# Dataset Card for Evaluation run of layoric/llama-2-13b-code-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/layoric/llama-2-13b-code-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [layoric/llama-2-13b-code-alpaca](https://huggingface.co/layoric/llama-2-13b-code-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_layoric__llama-2-13b-code-alpaca",
"harness_winogrande_5",
split="train")
```
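To load the aggregated "results" configuration listed in this repository's metadata, a minimal sketch (assuming the same `datasets` API, with the `results` config and `latest` split taken from the configuration list above) could be:

```python
from datasets import load_dataset

# Load the aggregated results instead of a single task;
# "latest" is the split that always points at the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_layoric__llama-2-13b-code-alpaca",
    "results",
    split="latest",
)
print(results)
```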
## Latest results
These are the [latest results from run 2023-09-17T08:33:30.933109](https://huggingface.co/datasets/open-llm-leaderboard/details_layoric__llama-2-13b-code-alpaca/blob/main/results_2023-09-17T08-33-30.933109.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0018875838926174498,
"em_stderr": 0.00044451099905589575,
"f1": 0.06352139261744941,
"f1_stderr": 0.001394404442569597,
"acc": 0.4415195195231134,
"acc_stderr": 0.010426765880718628
},
"harness|drop|3": {
"em": 0.0018875838926174498,
"em_stderr": 0.00044451099905589575,
"f1": 0.06352139261744941,
"f1_stderr": 0.001394404442569597
},
"harness|gsm8k|5": {
"acc": 0.11902956785443518,
"acc_stderr": 0.008919702911161632
},
"harness|winogrande|5": {
"acc": 0.7640094711917916,
"acc_stderr": 0.011933828850275625
}
}
```
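As a quick illustration only, the headline numbers above can be flattened into per-task rows; the dictionary below is copied verbatim from the results shown here:

```python
# Illustration: flatten the per-task metrics shown above
# into (task, metric, value) lines for quick inspection.
latest = {
    "harness|drop|3": {"em": 0.0018875838926174498, "f1": 0.06352139261744941},
    "harness|gsm8k|5": {"acc": 0.11902956785443518},
    "harness|winogrande|5": {"acc": 0.7640094711917916},
}

for task, metrics in latest.items():
    for name, value in metrics.items():
        print(f"{task:<25} {name:<4} {value:.4f}")
```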
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_layoric__llama-2-13b-code-alpaca
|
[
"region:us"
] |
2023-08-17T22:50:19+00:00
|
{"pretty_name": "Evaluation run of layoric/llama-2-13b-code-alpaca", "dataset_summary": "Dataset automatically created during the evaluation run of model [layoric/llama-2-13b-code-alpaca](https://huggingface.co/layoric/llama-2-13b-code-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_layoric__llama-2-13b-code-alpaca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T08:33:30.933109](https://huggingface.co/datasets/open-llm-leaderboard/details_layoric__llama-2-13b-code-alpaca/blob/main/results_2023-09-17T08-33-30.933109.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.00044451099905589575,\n \"f1\": 0.06352139261744941,\n \"f1_stderr\": 0.001394404442569597,\n \"acc\": 0.4415195195231134,\n \"acc_stderr\": 0.010426765880718628\n },\n \"harness|drop|3\": {\n \"em\": 0.0018875838926174498,\n \"em_stderr\": 0.00044451099905589575,\n \"f1\": 0.06352139261744941,\n \"f1_stderr\": 0.001394404442569597\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11902956785443518,\n \"acc_stderr\": 0.008919702911161632\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275625\n }\n}\n```", "repo_url": "https://huggingface.co/layoric/llama-2-13b-code-alpaca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|arc:challenge|25_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T08_33_30.933109", "path": ["**/details_harness|drop|3_2023-09-17T08-33-30.933109.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T08-33-30.933109.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T08_33_30.933109", "path": ["**/details_harness|gsm8k|5_2023-09-17T08-33-30.933109.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T08-33-30.933109.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hellaswag|10_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:43:19.893957.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:43:19.893957.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T14:43:19.893957.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T14:43:19.893957.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T14:43:19.893957.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T08_33_30.933109", "path": ["**/details_harness|winogrande|5_2023-09-17T08-33-30.933109.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T08-33-30.933109.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T14_43_19.893957", "path": ["results_2023-07-24T14:43:19.893957.parquet"]}, {"split": "2023_09_17T08_33_30.933109", "path": ["results_2023-09-17T08-33-30.933109.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T08-33-30.933109.parquet"]}]}]}
|
2023-09-17T07:33:42+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of layoric/llama-2-13b-code-alpaca
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model layoric/llama-2-13b-code-alpaca on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-09-17T08:33:30.933109 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of layoric/llama-2-13b-code-alpaca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model layoric/llama-2-13b-code-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T08:33:30.933109(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of layoric/llama-2-13b-code-alpaca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model layoric/llama-2-13b-code-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T08:33:30.933109(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
68,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of layoric/llama-2-13b-code-alpaca## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model layoric/llama-2-13b-code-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T08:33:30.933109(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
9547f8324c5e6b6ec8f655c4d4ec88326a0e399c
|
# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-llama-2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NousResearch/Nous-Hermes-llama-2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [NousResearch/Nous-Hermes-llama-2-7b](https://huggingface.co/NousResearch/Nous-Hermes-llama-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__Nous-Hermes-llama-2-7b",
"harness_winogrande_5",
split="train")
```
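Once loaded, the split behaves like any other `datasets` split, so it can be materialized as a pandas DataFrame for inspection. This is a minimal sketch (not part of the original card) and assumes `pandas` is installed alongside `datasets`:

```python
from datasets import load_dataset

# Per-example details for one task, as in the snippet above
details = load_dataset(
    "open-llm-leaderboard/details_NousResearch__Nous-Hermes-llama-2-7b",
    "harness_winogrande_5",
    split="train",
)

# Dataset.to_pandas() returns a pandas DataFrame, convenient for browsing
# individual prompts, model predictions, and per-example metrics.
df = details.to_pandas()
print(df.shape)
print(df.columns.tolist())
```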
## Latest results
These are the [latest results from run 2023-10-22T01:50:03.524306](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Hermes-llama-2-7b/blob/main/results_2023-10-22T01-50-03.524306.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.14649748322147652,
"em_stderr": 0.0036212385599472124,
"f1": 0.21412122483221444,
"f1_stderr": 0.0037396442766702157,
"acc": 0.3989754501778092,
"acc_stderr": 0.009370647012687763
},
"harness|drop|3": {
"em": 0.14649748322147652,
"em_stderr": 0.0036212385599472124,
"f1": 0.21412122483221444,
"f1_stderr": 0.0037396442766702157
},
"harness|gsm8k|5": {
"acc": 0.0576194086429113,
"acc_stderr": 0.006418593319822861
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
}
}
```
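As noted in the summary, an additional "results" configuration stores these aggregated numbers, and its "latest" split points at the most recent run. A minimal sketch of retrieving them programmatically (not part of the original card; the exact column layout of the results rows may vary):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_NousResearch__Nous-Hermes-llama-2-7b",
    "results",
    split="latest",
)
print(results[0])
```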
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_NousResearch__Nous-Hermes-llama-2-7b
|
[
"region:us"
] |
2023-08-17T22:50:28+00:00
|
{"pretty_name": "Evaluation run of NousResearch/Nous-Hermes-llama-2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [NousResearch/Nous-Hermes-llama-2-7b](https://huggingface.co/NousResearch/Nous-Hermes-llama-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__Nous-Hermes-llama-2-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T01:50:03.524306](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Hermes-llama-2-7b/blob/main/results_2023-10-22T01-50-03.524306.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.14649748322147652,\n \"em_stderr\": 0.0036212385599472124,\n \"f1\": 0.21412122483221444,\n \"f1_stderr\": 0.0037396442766702157,\n \"acc\": 0.3989754501778092,\n \"acc_stderr\": 0.009370647012687763\n },\n \"harness|drop|3\": {\n \"em\": 0.14649748322147652,\n \"em_stderr\": 0.0036212385599472124,\n \"f1\": 0.21412122483221444,\n \"f1_stderr\": 0.0037396442766702157\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0576194086429113,\n \"acc_stderr\": 0.006418593319822861\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n }\n}\n```", "repo_url": "https://huggingface.co/NousResearch/Nous-Hermes-llama-2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|arc:challenge|25_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T01_50_03.524306", "path": ["**/details_harness|drop|3_2023-10-22T01-50-03.524306.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T01-50-03.524306.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T01_50_03.524306", "path": ["**/details_harness|gsm8k|5_2023-10-22T01-50-03.524306.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T01-50-03.524306.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hellaswag|10_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T15:03:15.265717.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:03:15.265717.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:03:15.265717.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T15:03:15.265717.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T15:03:15.265717.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T15:03:15.265717.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T01_50_03.524306", "path": ["**/details_harness|winogrande|5_2023-10-22T01-50-03.524306.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T01-50-03.524306.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T15_03_15.265717", "path": ["results_2023-07-31T15:03:15.265717.parquet"]}, {"split": "2023_10_22T01_50_03.524306", "path": ["results_2023-10-22T01-50-03.524306.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T01-50-03.524306.parquet"]}]}]}
|
2023-10-22T00:50:15+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-llama-2-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model NousResearch/Nous-Hermes-llama-2-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-22T01:50:03.524306 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-llama-2-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model NousResearch/Nous-Hermes-llama-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T01:50:03.524306(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-llama-2-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model NousResearch/Nous-Hermes-llama-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T01:50:03.524306(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-llama-2-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model NousResearch/Nous-Hermes-llama-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T01:50:03.524306(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
bcf293f0bf13fb25ac4481b2f86bac8d85e48483
|
# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-Llama2-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NousResearch/Nous-Hermes-Llama2-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [NousResearch/Nous-Hermes-Llama2-13b](https://huggingface.co/NousResearch/Nous-Hermes-Llama2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 6 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__Nous-Hermes-Llama2-13b",
"harness_winogrande_5",
split="train")
```
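Because this dataset has been created from 6 run(s), each configuration also exposes timestamped splits in addition to "latest", one per run. A minimal sketch of loading one specific historical run (not part of the original card; the configuration and split names below are taken from this dataset's configuration metadata, so adjust them to the run you actually need):

```python
from datasets import load_dataset

# Load one specific historical run instead of the latest one;
# split names follow the run timestamp listed in the configs.
arc_run = load_dataset(
    "open-llm-leaderboard/details_NousResearch__Nous-Hermes-Llama2-13b",
    "harness_arc_challenge_25",
    split="2023_07_24T14_44_05.322938",
)
print(len(arc_run))
```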
## Latest results
These are the [latest results from run 2023-10-21T23:27:15.868927](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Hermes-Llama2-13b/blob/main/results_2023-10-21T23-27-15.868927.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.22934144295302014,
"em_stderr": 0.004305384313236111,
"f1": 0.30605285234899415,
"f1_stderr": 0.004296224150122663,
"acc": 0.4276861222626263,
"acc_stderr": 0.010194652064655127
},
"harness|drop|3": {
"em": 0.22934144295302014,
"em_stderr": 0.004305384313236111,
"f1": 0.30605285234899415,
"f1_stderr": 0.004296224150122663
},
"harness|gsm8k|5": {
"acc": 0.10083396512509477,
"acc_stderr": 0.008294031192126607
},
"harness|winogrande|5": {
"acc": 0.7545382794001578,
"acc_stderr": 0.012095272937183647
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_NousResearch__Nous-Hermes-Llama2-13b
|
[
"region:us"
] |
2023-08-17T22:50:36+00:00
|
{"pretty_name": "Evaluation run of NousResearch/Nous-Hermes-Llama2-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [NousResearch/Nous-Hermes-Llama2-13b](https://huggingface.co/NousResearch/Nous-Hermes-Llama2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 6 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__Nous-Hermes-Llama2-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-21T23:27:15.868927](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Hermes-Llama2-13b/blob/main/results_2023-10-21T23-27-15.868927.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.22934144295302014,\n \"em_stderr\": 0.004305384313236111,\n \"f1\": 0.30605285234899415,\n \"f1_stderr\": 0.004296224150122663,\n \"acc\": 0.4276861222626263,\n \"acc_stderr\": 0.010194652064655127\n },\n \"harness|drop|3\": {\n \"em\": 0.22934144295302014,\n \"em_stderr\": 0.004305384313236111,\n \"f1\": 0.30605285234899415,\n \"f1_stderr\": 0.004296224150122663\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10083396512509477,\n \"acc_stderr\": 0.008294031192126607\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7545382794001578,\n \"acc_stderr\": 0.012095272937183647\n }\n}\n```", "repo_url": "https://huggingface.co/NousResearch/Nous-Hermes-Llama2-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|arc:challenge|25_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|arc:challenge|25_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|arc:challenge|25_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_21T20_14_41.722716", "path": ["**/details_harness|drop|3_2023-10-21T20-14-41.722716.parquet"]}, {"split": "2023_10_21T21_17_49.044019", "path": ["**/details_harness|drop|3_2023-10-21T21-17-49.044019.parquet"]}, {"split": "2023_10_21T23_27_15.868927", "path": ["**/details_harness|drop|3_2023-10-21T23-27-15.868927.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-21T23-27-15.868927.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": 
"2023_10_21T20_14_41.722716", "path": ["**/details_harness|gsm8k|5_2023-10-21T20-14-41.722716.parquet"]}, {"split": "2023_10_21T21_17_49.044019", "path": ["**/details_harness|gsm8k|5_2023-10-21T21-17-49.044019.parquet"]}, {"split": "2023_10_21T23_27_15.868927", "path": ["**/details_harness|gsm8k|5_2023-10-21T23-27-15.868927.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-21T23-27-15.868927.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hellaswag|10_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hellaswag|10_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hellaswag|10_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:44:05.322938.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T14:44:05.322938.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T11:02:46.466402.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T11:02:46.466402.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T11:02:46.466402.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:55:06.636628.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:55:06.636628.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:55:06.636628.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-26T14:55:06.636628.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": 
"2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:55:06.636628.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", 
"path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": 
"2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": 
["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": 
["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T14:55:06.636628.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T14:55:06.636628.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_21T20_14_41.722716", "path": ["**/details_harness|winogrande|5_2023-10-21T20-14-41.722716.parquet"]}, {"split": "2023_10_21T21_17_49.044019", "path": ["**/details_harness|winogrande|5_2023-10-21T21-17-49.044019.parquet"]}, {"split": "2023_10_21T23_27_15.868927", "path": ["**/details_harness|winogrande|5_2023-10-21T23-27-15.868927.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-21T23-27-15.868927.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T14_44_05.322938", "path": ["results_2023-07-24T14:44:05.322938.parquet"]}, {"split": "2023_07_25T11_02_46.466402", "path": ["results_2023-07-25T11:02:46.466402.parquet"]}, {"split": "2023_07_26T14_55_06.636628", "path": ["results_2023-07-26T14:55:06.636628.parquet"]}, {"split": "2023_10_21T20_14_41.722716", "path": ["results_2023-10-21T20-14-41.722716.parquet"]}, {"split": "2023_10_21T21_17_49.044019", "path": ["results_2023-10-21T21-17-49.044019.parquet"]}, {"split": "2023_10_21T23_27_15.868927", "path": ["results_2023-10-21T23-27-15.868927.parquet"]}, {"split": "latest", "path": ["results_2023-10-21T23-27-15.868927.parquet"]}]}]}
|
2023-10-21T22:27:28+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-Llama2-13b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model NousResearch/Nous-Hermes-Llama2-13b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 6 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
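A minimal sketch of such a load call, assuming the standard `datasets` API and that this repository follows the leaderboard's usual `details_<org>__<model>` naming (the exact repository id and configuration name below are assumptions, not taken from this summary):

```python
from datasets import load_dataset

# Assumed repository id, following the details_<org>__<model> convention;
# "harness_winogrande_5" is one of the per-task configurations listed in this card's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_NousResearch__Nous-Hermes-Llama2-13b",
    "harness_winogrande_5",
    split="train",  # "train" always points to the latest results
)
```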
## Latest results
These are the latest results from run 2023-10-21T23:27:15.868927 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-Llama2-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model NousResearch/Nous-Hermes-Llama2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 6 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T23:27:15.868927(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-Llama2-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model NousResearch/Nous-Hermes-Llama2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 6 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T23:27:15.868927(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-Llama2-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model NousResearch/Nous-Hermes-Llama2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 6 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-21T23:27:15.868927(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
4874f3c7c4e17441616d7b564996d4da01f29b5c
|
# Dataset Card for Evaluation run of NousResearch/Redmond-Puffin-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NousResearch/Redmond-Puffin-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [NousResearch/Redmond-Puffin-13B](https://huggingface.co/NousResearch/Redmond-Puffin-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__Redmond-Puffin-13B",
"harness_winogrande_5",
split="train")
```
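The aggregated scores can be read the same way; a minimal sketch, assuming the "results" configuration and the "latest" split described above:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics of each run; the "latest" split
# points at the most recent one (split names follow this card's metadata).
results = load_dataset(
    "open-llm-leaderboard/details_NousResearch__Redmond-Puffin-13B",
    "results",
    split="latest",
)
print(results[0])
```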
## Latest results
These are the [latest results from run 2023-10-19T14:40:40.594002](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Redmond-Puffin-13B/blob/main/results_2023-10-19T14-40-40.594002.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626669685,
"f1": 0.06032822986577185,
"f1_stderr": 0.0013617956382083536,
"acc": 0.4385024770026802,
"acc_stderr": 0.01030687565094663
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626669685,
"f1": 0.06032822986577185,
"f1_stderr": 0.0013617956382083536
},
"harness|gsm8k|5": {
"acc": 0.11220621683093253,
"acc_stderr": 0.00869374313824238
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650882
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_NousResearch__Redmond-Puffin-13B
|
[
"region:us"
] |
2023-08-17T22:50:56+00:00
|
{"pretty_name": "Evaluation run of NousResearch/Redmond-Puffin-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [NousResearch/Redmond-Puffin-13B](https://huggingface.co/NousResearch/Redmond-Puffin-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__Redmond-Puffin-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T14:40:40.594002](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Redmond-Puffin-13B/blob/main/results_2023-10-19T14-40-40.594002.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626669685,\n \"f1\": 0.06032822986577185,\n \"f1_stderr\": 0.0013617956382083536,\n \"acc\": 0.4385024770026802,\n \"acc_stderr\": 0.01030687565094663\n },\n \"harness|drop|3\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626669685,\n \"f1\": 0.06032822986577185,\n \"f1_stderr\": 0.0013617956382083536\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11220621683093253,\n \"acc_stderr\": 0.00869374313824238\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650882\n }\n}\n```", "repo_url": "https://huggingface.co/NousResearch/Redmond-Puffin-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|arc:challenge|25_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|arc:challenge|25_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T01_48_06.701008", "path": ["**/details_harness|drop|3_2023-10-18T01-48-06.701008.parquet"]}, {"split": "2023_10_19T14_40_40.594002", "path": ["**/details_harness|drop|3_2023-10-19T14-40-40.594002.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T14-40-40.594002.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T01_48_06.701008", "path": ["**/details_harness|gsm8k|5_2023-10-18T01-48-06.701008.parquet"]}, {"split": "2023_10_19T14_40_40.594002", "path": ["**/details_harness|gsm8k|5_2023-10-19T14-40-40.594002.parquet"]}, {"split": "latest", "path": 
["**/details_harness|gsm8k|5_2023-10-19T14-40-40.594002.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hellaswag|10_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hellaswag|10_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:36:07.179231.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T14:36:07.179231.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:58:43.573402.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:58:43.573402.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:58:43.573402.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T10:58:43.573402.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": 
"2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": 
["**/details_harness|hendrycksTest-management|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": 
"2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T10:58:43.573402.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T10:58:43.573402.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T01_48_06.701008", "path": ["**/details_harness|winogrande|5_2023-10-18T01-48-06.701008.parquet"]}, {"split": "2023_10_19T14_40_40.594002", "path": ["**/details_harness|winogrande|5_2023-10-19T14-40-40.594002.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T14-40-40.594002.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T14_36_07.179231", "path": ["results_2023-07-24T14:36:07.179231.parquet"]}, {"split": "2023_07_25T10_58_43.573402", "path": ["results_2023-07-25T10:58:43.573402.parquet"]}, {"split": "2023_10_18T01_48_06.701008", "path": ["results_2023-10-18T01-48-06.701008.parquet"]}, {"split": "2023_10_19T14_40_40.594002", "path": ["results_2023-10-19T14-40-40.594002.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T14-40-40.594002.parquet"]}]}]}
|
2023-10-19T13:40:56+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of NousResearch/Redmond-Puffin-13B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model NousResearch/Redmond-Puffin-13B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
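For instance (a minimal sketch; the repository id assumes the leaderboard's usual `details_<org>__<model>` naming, and `harness_winogrande_5` is one of the configurations listed for this repo):

```python
from datasets import load_dataset

# Per-sample details for the 5-shot Winogrande evaluation of Redmond-Puffin-13B.
# The "latest" split points at the most recent evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_NousResearch__Redmond-Puffin-13B",  # assumed repo id
    "harness_winogrande_5",
    split="latest",
)
print(data)
```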
## Latest results
These are the latest results from run 2023-10-19T14:40:40.594002 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of NousResearch/Redmond-Puffin-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model NousResearch/Redmond-Puffin-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T14:40:40.594002(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NousResearch/Redmond-Puffin-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model NousResearch/Redmond-Puffin-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T14:40:40.594002(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of NousResearch/Redmond-Puffin-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model NousResearch/Redmond-Puffin-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-19T14:40:40.594002(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
f60d961f8d6c623d7d4ef35a0c80915001829503
|
# Dataset Card for "PKDD_GPT2_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
EgilKarlsen/PKDD_GPT2_Baseline
|
[
"region:us"
] |
2023-08-17T22:50:57+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115608907.5, "num_examples": 37500}, {"name": "test", "num_bytes": 38536305.0, "num_examples": 12500}], "download_size": 211867982, "dataset_size": 154145212.5}}
|
2023-08-17T22:56:31+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "PKDD_GPT2_Baseline"
More Information needed
|
[
"# Dataset Card for \"PKDD_GPT2_Baseline\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"PKDD_GPT2_Baseline\"\n\nMore Information needed"
] |
[
6,
19
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"PKDD_GPT2_Baseline\"\n\nMore Information needed"
] |
0e045ec68762ce6eb7101bb33798dfa340730611
|
# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NousResearch/Nous-Hermes-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [NousResearch/Nous-Hermes-13b](https://huggingface.co/NousResearch/Nous-Hermes-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__Nous-Hermes-13b",
"harness_winogrande_5",
split="train")
```
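To pull only the aggregated numbers, you can load the "results" configuration in the same way (a minimal sketch; the "latest" split is assumed to follow the same convention as the detail configurations):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of Nous-Hermes-13b.
results = load_dataset(
    "open-llm-leaderboard/details_NousResearch__Nous-Hermes-13b",
    "results",
    split="latest",
)
print(results[0])  # aggregated scores stored as one row per results file
```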
## Latest results
These are the [latest results from run 2023-10-19T04:00:41.897332](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Hermes-13b/blob/main/results_2023-10-19T04-00-41.897332.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.2930998322147651,
"em_stderr": 0.00466150847986569,
"f1": 0.37501048657718355,
"f1_stderr": 0.004576570475121802,
"acc": 0.41817812997218123,
"acc_stderr": 0.009868526609981134
},
"harness|drop|3": {
"em": 0.2930998322147651,
"em_stderr": 0.00466150847986569,
"f1": 0.37501048657718355,
"f1_stderr": 0.004576570475121802
},
"harness|gsm8k|5": {
"acc": 0.08339651250947688,
"acc_stderr": 0.00761565027710669
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.012121402942855576
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_NousResearch__Nous-Hermes-13b
|
[
"region:us"
] |
2023-08-17T22:51:13+00:00
|
{"pretty_name": "Evaluation run of NousResearch/Nous-Hermes-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [NousResearch/Nous-Hermes-13b](https://huggingface.co/NousResearch/Nous-Hermes-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__Nous-Hermes-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T04:00:41.897332](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Hermes-13b/blob/main/results_2023-10-19T04-00-41.897332.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2930998322147651,\n \"em_stderr\": 0.00466150847986569,\n \"f1\": 0.37501048657718355,\n \"f1_stderr\": 0.004576570475121802,\n \"acc\": 0.41817812997218123,\n \"acc_stderr\": 0.009868526609981134\n },\n \"harness|drop|3\": {\n \"em\": 0.2930998322147651,\n \"em_stderr\": 0.00466150847986569,\n \"f1\": 0.37501048657718355,\n \"f1_stderr\": 0.004576570475121802\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08339651250947688,\n \"acc_stderr\": 0.00761565027710669\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.012121402942855576\n }\n}\n```", "repo_url": "https://huggingface.co/NousResearch/Nous-Hermes-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|arc:challenge|25_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T04_00_41.897332", "path": ["**/details_harness|drop|3_2023-10-19T04-00-41.897332.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T04-00-41.897332.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T04_00_41.897332", "path": ["**/details_harness|gsm8k|5_2023-10-19T04-00-41.897332.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-19T04-00-41.897332.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hellaswag|10_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:33:41.626742.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:33:41.626742.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T15:33:41.626742.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T15:33:41.626742.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T15:33:41.626742.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T04_00_41.897332", "path": ["**/details_harness|winogrande|5_2023-10-19T04-00-41.897332.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T04-00-41.897332.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T15_33_41.626742", "path": ["results_2023-07-18T15:33:41.626742.parquet"]}, {"split": "2023_10_19T04_00_41.897332", "path": ["results_2023-10-19T04-00-41.897332.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T04-00-41.897332.parquet"]}]}]}
|
2023-10-19T03:00:55+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-13b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model NousResearch/Nous-Hermes-13b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-19T04:00:41.897332 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model NousResearch/Nous-Hermes-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T04:00:41.897332(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model NousResearch/Nous-Hermes-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T04:00:41.897332(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model NousResearch/Nous-Hermes-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-19T04:00:41.897332(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
e1abd0949627f6dd24facfcf0cf107d239fb29e3
|
# Dataset Card for Evaluation run of jphme/orca_mini_v2_ger_7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jphme/orca_mini_v2_ger_7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jphme/orca_mini_v2_ger_7b](https://huggingface.co/jphme/orca_mini_v2_ger_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jphme__orca_mini_v2_ger_7b",
"harness_winogrande_5",
split="train")
```
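You can also list every available configuration and pull the aggregated results directly. The snippet below is one possible way to do this with the standard `datasets` API; the repository id and the `"results"` configuration are taken from this card, and the `"latest"` split always mirrors the most recent run:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_jphme__orca_mini_v2_ger_7b"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(configs)

# Load the aggregated results; "latest" points to the most recent evaluation run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```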
## Latest results
These are the [latest results from run 2023-09-17T20:22:22.461526](https://huggingface.co/datasets/open-llm-leaderboard/details_jphme__orca_mini_v2_ger_7b/blob/main/results_2023-09-17T20-22-22.461526.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.05180369127516778,
"em_stderr": 0.002269703538491734,
"f1": 0.10419043624161092,
"f1_stderr": 0.0025209765448865502,
"acc": 0.3787812512528625,
"acc_stderr": 0.009090798922474245
},
"harness|drop|3": {
"em": 0.05180369127516778,
"em_stderr": 0.002269703538491734,
"f1": 0.10419043624161092,
"f1_stderr": 0.0025209765448865502
},
"harness|gsm8k|5": {
"acc": 0.04169825625473844,
"acc_stderr": 0.005506205058175767
},
"harness|winogrande|5": {
"acc": 0.7158642462509865,
"acc_stderr": 0.012675392786772722
}
}
```
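If you prefer working with the raw JSON file linked above rather than the parquet configurations, a minimal sketch is shown below. It assumes the file name taken from the link in this section and makes no further assumptions about the file layout beyond the summary printed above:

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jphme__orca_mini_v2_ger_7b",
    filename="results_2023-09-17T20-22-22.461526.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# The summary shown above may be nested under a "results" key; fall back to the
# top level if it is not.
scores = results.get("results", results)
print(scores["all"])
```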
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jphme__orca_mini_v2_ger_7b
|
[
"region:us"
] |
2023-08-17T22:51:22+00:00
|
{"pretty_name": "Evaluation run of jphme/orca_mini_v2_ger_7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [jphme/orca_mini_v2_ger_7b](https://huggingface.co/jphme/orca_mini_v2_ger_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jphme__orca_mini_v2_ger_7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T20:22:22.461526](https://huggingface.co/datasets/open-llm-leaderboard/details_jphme__orca_mini_v2_ger_7b/blob/main/results_2023-09-17T20-22-22.461526.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.05180369127516778,\n \"em_stderr\": 0.002269703538491734,\n \"f1\": 0.10419043624161092,\n \"f1_stderr\": 0.0025209765448865502,\n \"acc\": 0.3787812512528625,\n \"acc_stderr\": 0.009090798922474245\n },\n \"harness|drop|3\": {\n \"em\": 0.05180369127516778,\n \"em_stderr\": 0.002269703538491734,\n \"f1\": 0.10419043624161092,\n \"f1_stderr\": 0.0025209765448865502\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04169825625473844,\n \"acc_stderr\": 0.005506205058175767\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7158642462509865,\n \"acc_stderr\": 0.012675392786772722\n }\n}\n```", "repo_url": "https://huggingface.co/jphme/orca_mini_v2_ger_7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T20_22_22.461526", "path": ["**/details_harness|drop|3_2023-09-17T20-22-22.461526.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T20-22-22.461526.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T20_22_22.461526", "path": ["**/details_harness|gsm8k|5_2023-09-17T20-22-22.461526.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T20-22-22.461526.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:09:14.589500.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:09:14.589500.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:09:14.589500.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:09:14.589500.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:09:14.589500.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T20_22_22.461526", "path": ["**/details_harness|winogrande|5_2023-09-17T20-22-22.461526.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T20-22-22.461526.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T17_09_14.589500", "path": ["results_2023-07-19T17:09:14.589500.parquet"]}, {"split": "2023_09_17T20_22_22.461526", "path": ["results_2023-09-17T20-22-22.461526.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T20-22-22.461526.parquet"]}]}]}
|
2023-09-17T19:22:35+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jphme/orca_mini_v2_ger_7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jphme/orca_mini_v2_ger_7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
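A minimal loading sketch (the repository id below is inferred from the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other cards in this collection, and `harness_winogrande_5` is one of the configurations listed in this record's metadata; treat the exact repository id as an assumption):
```python
from datasets import load_dataset

# Repository id inferred from the details_<org>__<model> naming pattern (assumption);
# "harness_winogrande_5" is one of the per-task configurations of this run.
data = load_dataset(
    "open-llm-leaderboard/details_jphme__orca_mini_v2_ger_7b",
    "harness_winogrande_5",
    split="train",
)
print(data)
```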
## Latest results
These are the latest results from run 2023-09-17T20:22:22.461526 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jphme/orca_mini_v2_ger_7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jphme/orca_mini_v2_ger_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T20:22:22.461526(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jphme/orca_mini_v2_ger_7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jphme/orca_mini_v2_ger_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T20:22:22.461526(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jphme/orca_mini_v2_ger_7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jphme/orca_mini_v2_ger_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T20:22:22.461526(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
1236c3ee96594ce5dec44c8b985501149347960d
|
# Dataset Card for Evaluation run of vicgalle/alpaca-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/vicgalle/alpaca-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [vicgalle/alpaca-7b](https://huggingface.co/vicgalle/alpaca-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vicgalle__alpaca-7b",
"harness_winogrande_5",
split="train")
```
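To get the aggregated numbers rather than per-sample details, the "results" configuration listed in this card's metadata can be loaded the same way (a small sketch; the "latest" split always points to the most recent run):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points to the most recent one (2023-10-29 here).
results = load_dataset(
    "open-llm-leaderboard/details_vicgalle__alpaca-7b",
    "results",
    split="latest",
)
print(results[0])
```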
## Latest results
These are the [latest results from run 2023-10-29T00:32:06.511354](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__alpaca-7b/blob/main/results_2023-10-29T00-32-06.511354.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 4.0897651006711414e-05,
"f1_stderr": 1.4411331225340808e-05,
"acc": 0.24861878453038674,
"acc_stderr": 0.007026135605808218
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 4.0897651006711414e-05,
"f1_stderr": 1.4411331225340808e-05
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.4972375690607735,
"acc_stderr": 0.014052271211616436
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_vicgalle__alpaca-7b
|
[
"region:us"
] |
2023-08-17T22:51:30+00:00
|
{"pretty_name": "Evaluation run of vicgalle/alpaca-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [vicgalle/alpaca-7b](https://huggingface.co/vicgalle/alpaca-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vicgalle__alpaca-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-29T00:32:06.511354](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__alpaca-7b/blob/main/results_2023-10-29T00-32-06.511354.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 4.0897651006711414e-05,\n \"f1_stderr\": 1.4411331225340808e-05,\n \"acc\": 0.24861878453038674,\n \"acc_stderr\": 0.007026135605808218\n },\n \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\": 0.0,\n \"f1\": 4.0897651006711414e-05,\n \"f1_stderr\": 1.4411331225340808e-05\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4972375690607735,\n \"acc_stderr\": 0.014052271211616436\n }\n}\n```", "repo_url": "https://huggingface.co/vicgalle/alpaca-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|arc:challenge|25_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_29T00_32_06.511354", "path": ["**/details_harness|drop|3_2023-10-29T00-32-06.511354.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-29T00-32-06.511354.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_29T00_32_06.511354", "path": ["**/details_harness|gsm8k|5_2023-10-29T00-32-06.511354.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-29T00-32-06.511354.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hellaswag|10_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:34:16.138888.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T12:34:16.138888.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:34:16.138888.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T12:34:16.138888.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": 
[{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:34:16.138888.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T12:34:16.138888.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T12:34:16.138888.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_29T00_32_06.511354", "path": ["**/details_harness|winogrande|5_2023-10-29T00-32-06.511354.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-29T00-32-06.511354.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T12_34_16.138888", "path": ["results_2023-07-18T12:34:16.138888.parquet"]}, {"split": "2023_10_29T00_32_06.511354", "path": ["results_2023-10-29T00-32-06.511354.parquet"]}, {"split": "latest", "path": ["results_2023-10-29T00-32-06.511354.parquet"]}]}]}
|
2023-10-28T23:32:19+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of vicgalle/alpaca-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model vicgalle/alpaca-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
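For example (a minimal sketch mirroring the loading snippet from the full card above; `harness_winogrande_5` is one of the listed configurations):
```python
from datasets import load_dataset

# Per-sample details for one task configuration of this evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_vicgalle__alpaca-7b",
    "harness_winogrande_5",
    split="train",
)
```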
## Latest results
These are the latest results from run 2023-10-29T00:32:06.511354 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of vicgalle/alpaca-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model vicgalle/alpaca-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-29T00:32:06.511354(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vicgalle/alpaca-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model vicgalle/alpaca-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-29T00:32:06.511354(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
18,
31,
166,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of vicgalle/alpaca-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model vicgalle/alpaca-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-29T00:32:06.511354(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
9f079804b6a879530a9b3ebcdf25939360f08d06
|
# Dataset Card for Evaluation run of vicgalle/gpt2-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/vicgalle/gpt2-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [vicgalle/gpt2-alpaca](https://huggingface.co/vicgalle/gpt2-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vicgalle__gpt2-alpaca",
"harness_winogrande_5",
split="train")
```
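Because the card lists 64 configurations, it can help to enumerate them programmatically before choosing one (a small sketch using the `datasets` utility for listing configurations):
```python
from datasets import get_dataset_config_names

# List every available configuration (one per evaluated task, plus "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_vicgalle__gpt2-alpaca")
print(len(configs), configs[:5])
```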
## Latest results
These are the [latest results from run 2023-09-22T17:31:40.228869](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__gpt2-alpaca/blob/main/results_2023-09-22T17-31-40.228869.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.009542785234899329,
"em_stderr": 0.0009956233793266855,
"f1": 0.05457529362416121,
"f1_stderr": 0.001605303697316422,
"acc": 0.2533543804262036,
"acc_stderr": 0.0070256103461651745
},
"harness|drop|3": {
"em": 0.009542785234899329,
"em_stderr": 0.0009956233793266855,
"f1": 0.05457529362416121,
"f1_stderr": 0.001605303697316422
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5067087608524072,
"acc_stderr": 0.014051220692330349
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_vicgalle__gpt2-alpaca
|
[
"region:us"
] |
2023-08-17T22:51:39+00:00
|
{"pretty_name": "Evaluation run of vicgalle/gpt2-alpaca", "dataset_summary": "Dataset automatically created during the evaluation run of model [vicgalle/gpt2-alpaca](https://huggingface.co/vicgalle/gpt2-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vicgalle__gpt2-alpaca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T17:31:40.228869](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__gpt2-alpaca/blob/main/results_2023-09-22T17-31-40.228869.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.009542785234899329,\n \"em_stderr\": 0.0009956233793266855,\n \"f1\": 0.05457529362416121,\n \"f1_stderr\": 0.001605303697316422,\n \"acc\": 0.2533543804262036,\n \"acc_stderr\": 0.0070256103461651745\n },\n \"harness|drop|3\": {\n \"em\": 0.009542785234899329,\n \"em_stderr\": 0.0009956233793266855,\n \"f1\": 0.05457529362416121,\n \"f1_stderr\": 0.001605303697316422\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5067087608524072,\n \"acc_stderr\": 0.014051220692330349\n }\n}\n```", "repo_url": "https://huggingface.co/vicgalle/gpt2-alpaca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T17_31_40.228869", "path": ["**/details_harness|drop|3_2023-09-22T17-31-40.228869.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T17-31-40.228869.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T17_31_40.228869", "path": ["**/details_harness|gsm8k|5_2023-09-22T17-31-40.228869.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T17-31-40.228869.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:35:22.548714.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:35:22.548714.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:35:22.548714.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:35:22.548714.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:35:22.548714.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:35:22.548714.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T17_31_40.228869", "path": ["**/details_harness|winogrande|5_2023-09-22T17-31-40.228869.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T17-31-40.228869.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T10_35_22.548714", "path": ["results_2023-07-19T10:35:22.548714.parquet"]}, {"split": "2023_09_22T17_31_40.228869", "path": ["results_2023-09-22T17-31-40.228869.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T17-31-40.228869.parquet"]}]}]}
|
2023-09-22T16:31:52+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of vicgalle/gpt2-alpaca
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model vicgalle/gpt2-alpaca on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-09-22T17:31:40.228869 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of vicgalle/gpt2-alpaca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model vicgalle/gpt2-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T17:31:40.228869(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vicgalle/gpt2-alpaca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model vicgalle/gpt2-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T17:31:40.228869(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of vicgalle/gpt2-alpaca## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model vicgalle/gpt2-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T17:31:40.228869(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
a83aec27aee2586b9bce061101c3b48c1559a13a
|
# Dataset Card for Evaluation run of vicgalle/gpt2-alpaca-gpt4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/vicgalle/gpt2-alpaca-gpt4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [vicgalle/gpt2-alpaca-gpt4](https://huggingface.co/vicgalle/gpt2-alpaca-gpt4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vicgalle__gpt2-alpaca-gpt4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-13T08:11:17.165801](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__gpt2-alpaca-gpt4/blob/main/results_2023-10-13T08-11-17.165801.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436451,
"f1": 0.0483462667785236,
"f1_stderr": 0.0013978558370896523,
"acc": 0.26236870748869207,
"acc_stderr": 0.007776906388854586
},
"harness|drop|3": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436451,
"f1": 0.0483462667785236,
"f1_stderr": 0.0013978558370896523
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245457
},
"harness|winogrande|5": {
"acc": 0.5217048145224941,
"acc_stderr": 0.014039239216484626
}
}
```
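If you prefer to inspect these aggregated numbers programmatically instead of copying them from the JSON above, one option is to load the "results" configuration and flatten it with pandas. This is a sketch that assumes the "results" configuration and "latest" split listed in this repo's configuration section, and that the stored columns mirror the metric names shown above:
```python
import pandas as pd
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" points to the newest run.
results = load_dataset("open-llm-leaderboard/details_vicgalle__gpt2-alpaca-gpt4",
                       "results",
                       split="latest")

# Flatten to a DataFrame for easy inspection of the per-task metrics.
df = results.to_pandas()
print(df.head())
```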
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_vicgalle__gpt2-alpaca-gpt4
|
[
"region:us"
] |
2023-08-17T22:51:48+00:00
|
{"pretty_name": "Evaluation run of vicgalle/gpt2-alpaca-gpt4", "dataset_summary": "Dataset automatically created during the evaluation run of model [vicgalle/gpt2-alpaca-gpt4](https://huggingface.co/vicgalle/gpt2-alpaca-gpt4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vicgalle__gpt2-alpaca-gpt4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T08:11:17.165801](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__gpt2-alpaca-gpt4/blob/main/results_2023-10-13T08-11-17.165801.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003145973154362416,\n \"em_stderr\": 0.0005734993648436451,\n \"f1\": 0.0483462667785236,\n \"f1_stderr\": 0.0013978558370896523,\n \"acc\": 0.26236870748869207,\n \"acc_stderr\": 0.007776906388854586\n },\n \"harness|drop|3\": {\n \"em\": 0.003145973154362416,\n \"em_stderr\": 0.0005734993648436451,\n \"f1\": 0.0483462667785236,\n \"f1_stderr\": 0.0013978558370896523\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \"acc_stderr\": 0.0015145735612245457\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5217048145224941,\n \"acc_stderr\": 0.014039239216484626\n }\n}\n```", "repo_url": "https://huggingface.co/vicgalle/gpt2-alpaca-gpt4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T08_11_17.165801", "path": ["**/details_harness|drop|3_2023-10-13T08-11-17.165801.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T08-11-17.165801.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T08_11_17.165801", "path": ["**/details_harness|gsm8k|5_2023-10-13T08-11-17.165801.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T08-11-17.165801.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:37:55.436253.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:37:55.436253.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:37:55.436253.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:37:55.436253.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:37:55.436253.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T08_11_17.165801", "path": ["**/details_harness|winogrande|5_2023-10-13T08-11-17.165801.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T08-11-17.165801.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T10_37_55.436253", "path": ["results_2023-07-19T10:37:55.436253.parquet"]}, {"split": "2023_10_13T08_11_17.165801", "path": ["results_2023-10-13T08-11-17.165801.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T08-11-17.165801.parquet"]}]}]}
|
2023-10-13T07:11:28+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of vicgalle/gpt2-alpaca-gpt4
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model vicgalle/gpt2-alpaca-gpt4 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
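A minimal sketch of that load call (the repository id below assumes the usual open-llm-leaderboard/details_<org>__<model> naming for this card, and "harness_truthfulqa_mc_0" is one of the configurations listed in its metadata):
```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task of this run
data = load_dataset("open-llm-leaderboard/details_vicgalle__gpt2-alpaca-gpt4",
	"harness_truthfulqa_mc_0",
	split="train")
```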
## Latest results
These are the latest results from run 2023-10-13T08:11:17.165801 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of vicgalle/gpt2-alpaca-gpt4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model vicgalle/gpt2-alpaca-gpt4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-13T08:11:17.165801(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vicgalle/gpt2-alpaca-gpt4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model vicgalle/gpt2-alpaca-gpt4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-13T08:11:17.165801(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of vicgalle/gpt2-alpaca-gpt4## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model vicgalle/gpt2-alpaca-gpt4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T08:11:17.165801(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
52241514ce578ff40edcf9d7667c8e033a69e7af
|
# Dataset Card for Evaluation run of OptimalScale/robin-7b-v2-delta
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OptimalScale/robin-7b-v2-delta
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [OptimalScale/robin-7b-v2-delta](https://huggingface.co/OptimalScale/robin-7b-v2-delta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OptimalScale__robin-7b-v2-delta",
"harness_truthfulqa_mc_0",
split="train")
```
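The aggregated metrics go into the "results" configuration mentioned above; as a minimal sketch (the config name "results" and the "latest" split follow the naming used by this dataset's configs), they can be loaded the same way:

```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_OptimalScale__robin-7b-v2-delta",
	"results",
	split="latest")
```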
## Latest results
These are the [latest results from run 2023-08-04T17:26:25.175957](https://huggingface.co/datasets/open-llm-leaderboard/details_OptimalScale__robin-7b-v2-delta/blob/main/results_2023-08-04T17%3A26%3A25.175957.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3929860483472965,
"acc_stderr": 0.03475011382789092,
"acc_norm": 0.3973144877483442,
"acc_norm_stderr": 0.034741735928325954,
"mc1": 0.2729498164014688,
"mc1_stderr": 0.01559475363200652,
"mc2": 0.4227251694229852,
"mc2_stderr": 0.014483446210472699
},
"harness|arc:challenge|25": {
"acc": 0.4351535836177474,
"acc_stderr": 0.014487986197186047,
"acc_norm": 0.49146757679180886,
"acc_norm_stderr": 0.01460926316563219
},
"harness|hellaswag|10": {
"acc": 0.5452101175064729,
"acc_stderr": 0.004969341773423513,
"acc_norm": 0.7442740489942242,
"acc_norm_stderr": 0.004353768730644565
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.35526315789473684,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.35526315789473684,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.39245283018867927,
"acc_stderr": 0.03005258057955784,
"acc_norm": 0.39245283018867927,
"acc_norm_stderr": 0.03005258057955784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.040166600304512336,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.040166600304512336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.03567603799639169,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.03567603799639169
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03835153954399421,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03835153954399421
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.31724137931034485,
"acc_stderr": 0.038783523721386215,
"acc_norm": 0.31724137931034485,
"acc_norm_stderr": 0.038783523721386215
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924315,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924315
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4,
"acc_stderr": 0.027869320571664632,
"acc_norm": 0.4,
"acc_norm_stderr": 0.027869320571664632
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114482,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114482
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5212121212121212,
"acc_stderr": 0.03900828913737302,
"acc_norm": 0.5212121212121212,
"acc_norm_stderr": 0.03900828913737302
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.03547601494006937,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.03547601494006937
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5129533678756477,
"acc_stderr": 0.036072280610477486,
"acc_norm": 0.5129533678756477,
"acc_norm_stderr": 0.036072280610477486
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32564102564102565,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.32564102564102565,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.45871559633027525,
"acc_stderr": 0.021364122533881695,
"acc_norm": 0.45871559633027525,
"acc_norm_stderr": 0.021364122533881695
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653063,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653063
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.47549019607843135,
"acc_stderr": 0.035050931943487976,
"acc_norm": 0.47549019607843135,
"acc_norm_stderr": 0.035050931943487976
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5232067510548524,
"acc_stderr": 0.03251215201141018,
"acc_norm": 0.5232067510548524,
"acc_norm_stderr": 0.03251215201141018
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5112107623318386,
"acc_stderr": 0.033549366530984746,
"acc_norm": 0.5112107623318386,
"acc_norm_stderr": 0.033549366530984746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5867768595041323,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.5867768595041323,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437056,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437056
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.39263803680981596,
"acc_stderr": 0.03836740907831028,
"acc_norm": 0.39263803680981596,
"acc_norm_stderr": 0.03836740907831028
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.47572815533980584,
"acc_stderr": 0.049449010929737795,
"acc_norm": 0.47572815533980584,
"acc_norm_stderr": 0.049449010929737795
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.594017094017094,
"acc_stderr": 0.03217180182641086,
"acc_norm": 0.594017094017094,
"acc_norm_stderr": 0.03217180182641086
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5389527458492975,
"acc_stderr": 0.017825621793239012,
"acc_norm": 0.5389527458492975,
"acc_norm_stderr": 0.017825621793239012
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4421965317919075,
"acc_stderr": 0.0267386036438074,
"acc_norm": 0.4421965317919075,
"acc_norm_stderr": 0.0267386036438074
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.01444415780826145,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.01444415780826145
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02791405551046801,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02791405551046801
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4405144694533762,
"acc_stderr": 0.028196400574197422,
"acc_norm": 0.4405144694533762,
"acc_norm_stderr": 0.028196400574197422
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4876543209876543,
"acc_stderr": 0.027812262269327235,
"acc_norm": 0.4876543209876543,
"acc_norm_stderr": 0.027812262269327235
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30141843971631205,
"acc_stderr": 0.02737412888263115,
"acc_norm": 0.30141843971631205,
"acc_norm_stderr": 0.02737412888263115
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.29465449804432853,
"acc_stderr": 0.011643576764069546,
"acc_norm": 0.29465449804432853,
"acc_norm_stderr": 0.011643576764069546
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3786764705882353,
"acc_stderr": 0.029465133639776132,
"acc_norm": 0.3786764705882353,
"acc_norm_stderr": 0.029465133639776132
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.019506291693954847,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.019506291693954847
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.43636363636363634,
"acc_stderr": 0.04750185058907297,
"acc_norm": 0.43636363636363634,
"acc_norm_stderr": 0.04750185058907297
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3142857142857143,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.3142857142857143,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4527363184079602,
"acc_stderr": 0.03519702717576915,
"acc_norm": 0.4527363184079602,
"acc_norm_stderr": 0.03519702717576915
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322416,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322416
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5847953216374269,
"acc_stderr": 0.037792759455032014,
"acc_norm": 0.5847953216374269,
"acc_norm_stderr": 0.037792759455032014
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2729498164014688,
"mc1_stderr": 0.01559475363200652,
"mc2": 0.4227251694229852,
"mc2_stderr": 0.014483446210472699
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_OptimalScale__robin-7b-v2-delta
|
[
"region:us"
] |
2023-08-17T22:51:57+00:00
|
{"pretty_name": "Evaluation run of OptimalScale/robin-7b-v2-delta", "dataset_summary": "Dataset automatically created during the evaluation run of model [OptimalScale/robin-7b-v2-delta](https://huggingface.co/OptimalScale/robin-7b-v2-delta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OptimalScale__robin-7b-v2-delta\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-04T17:26:25.175957](https://huggingface.co/datasets/open-llm-leaderboard/details_OptimalScale__robin-7b-v2-delta/blob/main/results_2023-08-04T17%3A26%3A25.175957.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3929860483472965,\n \"acc_stderr\": 0.03475011382789092,\n \"acc_norm\": 0.3973144877483442,\n \"acc_norm_stderr\": 0.034741735928325954,\n \"mc1\": 0.2729498164014688,\n \"mc1_stderr\": 0.01559475363200652,\n \"mc2\": 0.4227251694229852,\n \"mc2_stderr\": 0.014483446210472699\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4351535836177474,\n \"acc_stderr\": 0.014487986197186047,\n \"acc_norm\": 0.49146757679180886,\n \"acc_norm_stderr\": 0.01460926316563219\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5452101175064729,\n \"acc_stderr\": 0.004969341773423513,\n \"acc_norm\": 0.7442740489942242,\n \"acc_norm_stderr\": 0.004353768730644565\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.35526315789473684,\n \"acc_stderr\": 0.03894734487013317,\n \"acc_norm\": 0.35526315789473684,\n \"acc_norm_stderr\": 0.03894734487013317\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.39245283018867927,\n \"acc_stderr\": 0.03005258057955784,\n \"acc_norm\": 0.39245283018867927,\n \"acc_norm_stderr\": 0.03005258057955784\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3611111111111111,\n \"acc_stderr\": 0.040166600304512336,\n \"acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.040166600304512336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 
0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n \"acc_stderr\": 0.03567603799639169,\n \"acc_norm\": 0.3236994219653179,\n \"acc_norm_stderr\": 0.03567603799639169\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231004,\n \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231004\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03835153954399421,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03835153954399421\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.038783523721386215,\n \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.038783523721386215\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.03852273364924315,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.03852273364924315\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.027869320571664632,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.027869320571664632\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114482,\n \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114482\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5212121212121212,\n \"acc_stderr\": 0.03900828913737302,\n \"acc_norm\": 0.5212121212121212,\n \"acc_norm_stderr\": 0.03900828913737302\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.45454545454545453,\n \"acc_stderr\": 0.03547601494006937,\n \"acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.03547601494006937\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5129533678756477,\n \"acc_stderr\": 0.036072280610477486,\n \"acc_norm\": 0.5129533678756477,\n \"acc_norm_stderr\": 0.036072280610477486\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.32564102564102565,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.32564102564102565,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.45871559633027525,\n \"acc_stderr\": 0.021364122533881695,\n \"acc_norm\": 0.45871559633027525,\n \"acc_norm_stderr\": 0.021364122533881695\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653063,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653063\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.47549019607843135,\n \"acc_stderr\": 0.035050931943487976,\n \"acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.035050931943487976\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5232067510548524,\n \"acc_stderr\": 0.03251215201141018,\n \"acc_norm\": 0.5232067510548524,\n \"acc_norm_stderr\": 0.03251215201141018\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5112107623318386,\n \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.5112107623318386,\n \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262972,\n \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262972\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5867768595041323,\n \"acc_stderr\": 0.04495087843548408,\n \"acc_norm\": 0.5867768595041323,\n \"acc_norm_stderr\": 0.04495087843548408\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.04832853553437056,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.04832853553437056\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.39263803680981596,\n \"acc_stderr\": 0.03836740907831028,\n \"acc_norm\": 0.39263803680981596,\n \"acc_norm_stderr\": 0.03836740907831028\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.47572815533980584,\n \"acc_stderr\": 0.049449010929737795,\n \"acc_norm\": 0.47572815533980584,\n \"acc_norm_stderr\": 0.049449010929737795\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.594017094017094,\n \"acc_stderr\": 0.03217180182641086,\n \"acc_norm\": 0.594017094017094,\n \"acc_norm_stderr\": 0.03217180182641086\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5389527458492975,\n \"acc_stderr\": 
0.017825621793239012,\n \"acc_norm\": 0.5389527458492975,\n \"acc_norm_stderr\": 0.017825621793239012\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4421965317919075,\n \"acc_stderr\": 0.0267386036438074,\n \"acc_norm\": 0.4421965317919075,\n \"acc_norm_stderr\": 0.0267386036438074\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n \"acc_stderr\": 0.01444415780826145,\n \"acc_norm\": 0.24804469273743016,\n \"acc_norm_stderr\": 0.01444415780826145\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02791405551046801,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02791405551046801\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4405144694533762,\n \"acc_stderr\": 0.028196400574197422,\n \"acc_norm\": 0.4405144694533762,\n \"acc_norm_stderr\": 0.028196400574197422\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327235,\n \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327235\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.30141843971631205,\n \"acc_stderr\": 0.02737412888263115,\n \"acc_norm\": 0.30141843971631205,\n \"acc_norm_stderr\": 0.02737412888263115\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.29465449804432853,\n \"acc_stderr\": 0.011643576764069546,\n \"acc_norm\": 0.29465449804432853,\n \"acc_norm_stderr\": 0.011643576764069546\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3786764705882353,\n \"acc_stderr\": 0.029465133639776132,\n \"acc_norm\": 0.3786764705882353,\n \"acc_norm_stderr\": 0.029465133639776132\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.019506291693954847,\n \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.019506291693954847\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.43636363636363634,\n \"acc_stderr\": 0.04750185058907297,\n \"acc_norm\": 0.43636363636363634,\n \"acc_norm_stderr\": 0.04750185058907297\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3142857142857143,\n \"acc_stderr\": 0.029719329422417475,\n \"acc_norm\": 0.3142857142857143,\n \"acc_norm_stderr\": 0.029719329422417475\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4527363184079602,\n \"acc_stderr\": 0.03519702717576915,\n \"acc_norm\": 0.4527363184079602,\n \"acc_norm_stderr\": 0.03519702717576915\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n \"acc_stderr\": 0.03726214354322416,\n \"acc_norm\": 0.35542168674698793,\n \"acc_norm_stderr\": 0.03726214354322416\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5847953216374269,\n \"acc_stderr\": 0.037792759455032014,\n \"acc_norm\": 0.5847953216374269,\n \"acc_norm_stderr\": 0.037792759455032014\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2729498164014688,\n \"mc1_stderr\": 0.01559475363200652,\n \"mc2\": 0.4227251694229852,\n \"mc2_stderr\": 0.014483446210472699\n }\n}\n```", "repo_url": "https://huggingface.co/OptimalScale/robin-7b-v2-delta", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": 
[{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|arc:challenge|25_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hellaswag|10_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T17:26:25.175957.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-04T17:26:25.175957.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T17:26:25.175957.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-04T17:26:25.175957.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-04T17:26:25.175957.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_04T17_26_25.175957", "path": ["results_2023-08-04T17:26:25.175957.parquet"]}, {"split": "latest", "path": ["results_2023-08-04T17:26:25.175957.parquet"]}]}]}
|
2023-08-27T11:24:39+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of OptimalScale/robin-7b-v2-delta
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model OptimalScale/robin-7b-v2-delta on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-08-04T17:26:25.175957 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of OptimalScale/robin-7b-v2-delta",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model OptimalScale/robin-7b-v2-delta on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-04T17:26:25.175957 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of OptimalScale/robin-7b-v2-delta",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model OptimalScale/robin-7b-v2-delta on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-04T17:26:25.175957 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of OptimalScale/robin-7b-v2-delta## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model OptimalScale/robin-7b-v2-delta on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-04T17:26:25.175957 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
2008d924e19ae4669fc274e94795bfc9ba31b670
|
# Dataset Card for Evaluation run of OptimalScale/robin-13b-v2-delta
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OptimalScale/robin-13b-v2-delta
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [OptimalScale/robin-13b-v2-delta](https://huggingface.co/OptimalScale/robin-13b-v2-delta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OptimalScale__robin-13b-v2-delta",
"harness_truthfulqa_mc_0",
split="train")
```
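
If you want the aggregated scores of the run rather than the per-sample details of a single task, you can load the "results" configuration instead. The snippet below is a minimal sketch, assuming the `results` configuration and its `latest` split are exposed as described above:

```python
from datasets import load_dataset

# Load the aggregated metrics of the run (the numbers shown under "Latest results")
# instead of the per-sample details of one task.
results = load_dataset(
    "open-llm-leaderboard/details_OptimalScale__robin-13b-v2-delta",
    "results",
    split="latest",
)

print(results[0])  # latest aggregated results row
```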
## Latest results
These are the [latest results from run 2023-08-04T18:08:52.244101](https://huggingface.co/datasets/open-llm-leaderboard/details_OptimalScale__robin-13b-v2-delta/blob/main/results_2023-08-04T18%3A08%3A52.244101.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48671411429389705,
"acc_stderr": 0.034851524265514446,
"acc_norm": 0.49073578692938213,
"acc_norm_stderr": 0.03483423136146648,
"mc1": 0.3378212974296206,
"mc1_stderr": 0.016557167322516882,
"mc2": 0.5054136576088012,
"mc2_stderr": 0.014772161409527505
},
"harness|arc:challenge|25": {
"acc": 0.537542662116041,
"acc_stderr": 0.014570144495075581,
"acc_norm": 0.5656996587030717,
"acc_norm_stderr": 0.014484703048857364
},
"harness|hellaswag|10": {
"acc": 0.5944035052778331,
"acc_stderr": 0.004900036261309047,
"acc_norm": 0.8035251941844254,
"acc_norm_stderr": 0.003965196368697847
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.45660377358490567,
"acc_stderr": 0.030656748696739435,
"acc_norm": 0.45660377358490567,
"acc_norm_stderr": 0.030656748696739435
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.037786210790920545,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.037786210790920545
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4068965517241379,
"acc_stderr": 0.04093793981266237,
"acc_norm": 0.4068965517241379,
"acc_norm_stderr": 0.04093793981266237
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113946,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113946
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.49032258064516127,
"acc_stderr": 0.028438677998909558,
"acc_norm": 0.49032258064516127,
"acc_norm_stderr": 0.028438677998909558
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.037694303145125674,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.037694303145125674
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5606060606060606,
"acc_stderr": 0.03536085947529479,
"acc_norm": 0.5606060606060606,
"acc_norm_stderr": 0.03536085947529479
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6735751295336787,
"acc_stderr": 0.033840286211432945,
"acc_norm": 0.6735751295336787,
"acc_norm_stderr": 0.033840286211432945
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.025174048384000756,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.025174048384000756
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.02564410863926762,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.02564410863926762
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4369747899159664,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.4369747899159664,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6458715596330276,
"acc_stderr": 0.02050472901382912,
"acc_norm": 0.6458715596330276,
"acc_norm_stderr": 0.02050472901382912
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.03085199299325701,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.03085199299325701
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.03384132045674118,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.03384132045674118
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955924,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955924
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5695067264573991,
"acc_stderr": 0.0332319730294294,
"acc_norm": 0.5695067264573991,
"acc_norm_stderr": 0.0332319730294294
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5828220858895705,
"acc_stderr": 0.038741028598180814,
"acc_norm": 0.5828220858895705,
"acc_norm_stderr": 0.038741028598180814
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7521367521367521,
"acc_stderr": 0.028286324075564397,
"acc_norm": 0.7521367521367521,
"acc_norm_stderr": 0.028286324075564397
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6845466155810983,
"acc_stderr": 0.016617501738763394,
"acc_norm": 0.6845466155810983,
"acc_norm_stderr": 0.016617501738763394
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4884393063583815,
"acc_stderr": 0.02691189868637792,
"acc_norm": 0.4884393063583815,
"acc_norm_stderr": 0.02691189868637792
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859926,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859926
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.02861462475280544,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.02861462475280544
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5305466237942122,
"acc_stderr": 0.02834504586484061,
"acc_norm": 0.5305466237942122,
"acc_norm_stderr": 0.02834504586484061
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.027701228468542595,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.027701228468542595
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347247,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347247
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4172099087353325,
"acc_stderr": 0.012593959992906424,
"acc_norm": 0.4172099087353325,
"acc_norm_stderr": 0.012593959992906424
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.0303720158854282,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.0303720158854282
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.020220920829626923,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.020220920829626923
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5551020408163265,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.5551020408163265,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.03368787466115459,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.03368787466115459
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.035087719298245626,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.035087719298245626
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3378212974296206,
"mc1_stderr": 0.016557167322516882,
"mc2": 0.5054136576088012,
"mc2_stderr": 0.014772161409527505
}
}
```
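
As a quick sanity check, these per-task numbers can be re-aggregated by hand. The sketch below is only illustrative: the `results.json` file name is hypothetical and stands for a local copy of the JSON block above; the code simply averages the `acc` values of the `hendrycksTest` (MMLU) sub-tasks.

```python
import json

# "results.json" is a hypothetical local copy of the JSON block shown above.
with open("results.json") as f:
    results = json.load(f)

# Average accuracy across the hendrycksTest (MMLU) sub-tasks.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"MMLU average acc: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```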
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_OptimalScale__robin-13b-v2-delta
|
[
"region:us"
] |
2023-08-17T22:52:06+00:00
|
{"pretty_name": "Evaluation run of OptimalScale/robin-13b-v2-delta", "dataset_summary": "Dataset automatically created during the evaluation run of model [OptimalScale/robin-13b-v2-delta](https://huggingface.co/OptimalScale/robin-13b-v2-delta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OptimalScale__robin-13b-v2-delta\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-04T18:08:52.244101](https://huggingface.co/datasets/open-llm-leaderboard/details_OptimalScale__robin-13b-v2-delta/blob/main/results_2023-08-04T18%3A08%3A52.244101.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48671411429389705,\n \"acc_stderr\": 0.034851524265514446,\n \"acc_norm\": 0.49073578692938213,\n \"acc_norm_stderr\": 0.03483423136146648,\n \"mc1\": 0.3378212974296206,\n \"mc1_stderr\": 0.016557167322516882,\n \"mc2\": 0.5054136576088012,\n \"mc2_stderr\": 0.014772161409527505\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.537542662116041,\n \"acc_stderr\": 0.014570144495075581,\n \"acc_norm\": 0.5656996587030717,\n \"acc_norm_stderr\": 0.014484703048857364\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5944035052778331,\n \"acc_stderr\": 0.004900036261309047,\n \"acc_norm\": 0.8035251941844254,\n \"acc_norm_stderr\": 0.003965196368697847\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.45660377358490567,\n \"acc_stderr\": 0.030656748696739435,\n \"acc_norm\": 0.45660377358490567,\n \"acc_norm_stderr\": 0.030656748696739435\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 
0.04229525846816507,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n \"acc_stderr\": 0.037786210790920545,\n \"acc_norm\": 0.43352601156069365,\n \"acc_norm_stderr\": 0.037786210790920545\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.031967586978353627,\n \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.031967586978353627\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4068965517241379,\n \"acc_stderr\": 0.04093793981266237,\n \"acc_norm\": 0.4068965517241379,\n \"acc_norm_stderr\": 0.04093793981266237\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113946,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113946\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.49032258064516127,\n \"acc_stderr\": 0.028438677998909558,\n \"acc_norm\": 0.49032258064516127,\n \"acc_norm_stderr\": 0.028438677998909558\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.032550867699701024,\n \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.032550867699701024\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.037694303145125674,\n \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.037694303145125674\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5606060606060606,\n \"acc_stderr\": 0.03536085947529479,\n \"acc_norm\": 0.5606060606060606,\n \"acc_norm_stderr\": 0.03536085947529479\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6735751295336787,\n \"acc_stderr\": 0.033840286211432945,\n \"acc_norm\": 0.6735751295336787,\n \"acc_norm_stderr\": 0.033840286211432945\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.441025641025641,\n \"acc_stderr\": 0.025174048384000756,\n \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.025174048384000756\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.02564410863926762,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.02564410863926762\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4369747899159664,\n \"acc_stderr\": 0.03221943636566196,\n \"acc_norm\": 0.4369747899159664,\n \"acc_norm_stderr\": 0.03221943636566196\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6458715596330276,\n \"acc_stderr\": 0.02050472901382912,\n \"acc_norm\": 0.6458715596330276,\n \"acc_norm_stderr\": 0.02050472901382912\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.03085199299325701,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.03085199299325701\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.03384132045674118,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.03384132045674118\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955924,\n \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955924\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5695067264573991,\n \"acc_stderr\": 0.0332319730294294,\n \"acc_norm\": 0.5695067264573991,\n \"acc_norm_stderr\": 0.0332319730294294\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.038741028598180814,\n \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.038741028598180814\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n \"acc_stderr\": 0.028286324075564397,\n \"acc_norm\": 0.7521367521367521,\n \"acc_norm_stderr\": 0.028286324075564397\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6845466155810983,\n \"acc_stderr\": 
0.016617501738763394,\n \"acc_norm\": 0.6845466155810983,\n \"acc_norm_stderr\": 0.016617501738763394\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4884393063583815,\n \"acc_stderr\": 0.02691189868637792,\n \"acc_norm\": 0.4884393063583815,\n \"acc_norm_stderr\": 0.02691189868637792\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n \"acc_stderr\": 0.014465893829859926,\n \"acc_norm\": 0.24916201117318434,\n \"acc_norm_stderr\": 0.014465893829859926\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.02861462475280544,\n \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.02861462475280544\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5305466237942122,\n \"acc_stderr\": 0.02834504586484061,\n \"acc_norm\": 0.5305466237942122,\n \"acc_norm_stderr\": 0.02834504586484061\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.027701228468542595,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.027701228468542595\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347247,\n \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347247\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4172099087353325,\n \"acc_stderr\": 0.012593959992906424,\n \"acc_norm\": 0.4172099087353325,\n \"acc_norm_stderr\": 0.012593959992906424\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.0303720158854282,\n \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.0303720158854282\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.020220920829626923,\n \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.020220920829626923\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.031814251181977865,\n \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 0.031814251181977865\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n \"acc_stderr\": 0.03368787466115459,\n \"acc_norm\": 0.6517412935323383,\n \"acc_norm_stderr\": 0.03368787466115459\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.035087719298245626,\n \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.035087719298245626\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3378212974296206,\n \"mc1_stderr\": 0.016557167322516882,\n \"mc2\": 0.5054136576088012,\n \"mc2_stderr\": 0.014772161409527505\n }\n}\n```", "repo_url": "https://huggingface.co/OptimalScale/robin-13b-v2-delta", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": 
"harness_arc_challenge_25", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|arc:challenge|25_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hellaswag|10_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T18:08:52.244101.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T18:08:52.244101.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T18:08:52.244101.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-04T18:08:52.244101.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-04T18:08:52.244101.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_04T18_08_52.244101", "path": ["results_2023-08-04T18:08:52.244101.parquet"]}, {"split": "latest", "path": ["results_2023-08-04T18:08:52.244101.parquet"]}]}]}
|
2023-08-27T11:24:41+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of OptimalScale/robin-13b-v2-delta
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model OptimalScale/robin-13b-v2-delta on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
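For instance, a minimal snippet (assuming the `datasets` library is installed; the configuration name selects one of the evaluated tasks and can be swapped for any other configuration in this repository):

```python
from datasets import load_dataset

# Load the per-sample details of the TruthfulQA MC eval for this model
data = load_dataset("open-llm-leaderboard/details_OptimalScale__robin-13b-v2-delta",
	"harness_truthfulqa_mc_0",
	split="train")
```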
## Latest results
These are the latest results from run 2023-08-04T18:08:52.244101 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of OptimalScale/robin-13b-v2-delta",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model OptimalScale/robin-13b-v2-delta on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-04T18:08:52.244101 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of OptimalScale/robin-13b-v2-delta",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model OptimalScale/robin-13b-v2-delta on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-04T18:08:52.244101 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of OptimalScale/robin-13b-v2-delta## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model OptimalScale/robin-13b-v2-delta on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-04T18:08:52.244101 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
390b9ae35dbde078dfefe56d72d9d8164a439322
|
# Dataset Card for Evaluation run of OptimalScale/robin-65b-v2-delta
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OptimalScale/robin-65b-v2-delta
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [OptimalScale/robin-65b-v2-delta](https://huggingface.co/OptimalScale/robin-65b-v2-delta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OptimalScale__robin-65b-v2-delta",
"harness_truthfulqa_mc_0",
split="train")
```
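The aggregated metrics described above live in the "results" configuration; as a minimal sketch (assuming the same repository name and the "latest" split, which points to the most recent run):

```python
from datasets import load_dataset

# Aggregated metrics ("results" configuration) for the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_OptimalScale__robin-65b-v2-delta",
	"results",
	split="latest")
```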
## Latest results
These are the [latest results from run 2023-08-08T22:36:48.972258](https://huggingface.co/datasets/open-llm-leaderboard/details_OptimalScale__robin-65b-v2-delta/blob/main/results_2023-08-08T22%3A36%3A48.972258.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6078684635354565,
"acc_stderr": 0.03335692759261142,
"acc_norm": 0.6117336666004985,
"acc_norm_stderr": 0.03333764223356639,
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323013,
"mc2": 0.4474047797612128,
"mc2_stderr": 0.014194757681816514
},
"harness|arc:challenge|25": {
"acc": 0.5819112627986348,
"acc_stderr": 0.014413988396996074,
"acc_norm": 0.6075085324232082,
"acc_norm_stderr": 0.014269634635670731
},
"harness|hellaswag|10": {
"acc": 0.6137223660625374,
"acc_stderr": 0.004859004184694605,
"acc_norm": 0.8161720772754432,
"acc_norm_stderr": 0.0038655217623631624
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715563,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728762,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728762
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.0250437573185202,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.0250437573185202
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.02518900666021238,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.02518900666021238
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647078,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647078
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154333,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154333
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02759917430064077,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02759917430064077
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7994891443167306,
"acc_stderr": 0.014317653708594207,
"acc_norm": 0.7994891443167306,
"acc_norm_stderr": 0.014317653708594207
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34413407821229053,
"acc_stderr": 0.015889221313307094,
"acc_norm": 0.34413407821229053,
"acc_norm_stderr": 0.015889221313307094
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967294,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967294
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045708,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045708
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.029972807170464622,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.029972807170464622
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0190709855896875,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0190709855896875
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323013,
"mc2": 0.4474047797612128,
"mc2_stderr": 0.014194757681816514
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_OptimalScale__robin-65b-v2-delta
|
[
"region:us"
] |
2023-08-17T22:52:14+00:00
|
{"pretty_name": "Evaluation run of OptimalScale/robin-65b-v2-delta", "dataset_summary": "Dataset automatically created during the evaluation run of model [OptimalScale/robin-65b-v2-delta](https://huggingface.co/OptimalScale/robin-65b-v2-delta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OptimalScale__robin-65b-v2-delta\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-08T22:36:48.972258](https://huggingface.co/datasets/open-llm-leaderboard/details_OptimalScale__robin-65b-v2-delta/blob/main/results_2023-08-08T22%3A36%3A48.972258.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6078684635354565,\n \"acc_stderr\": 0.03335692759261142,\n \"acc_norm\": 0.6117336666004985,\n \"acc_norm_stderr\": 0.03333764223356639,\n \"mc1\": 0.3072215422276622,\n \"mc1_stderr\": 0.016150201321323013,\n \"mc2\": 0.4474047797612128,\n \"mc2_stderr\": 0.014194757681816514\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5819112627986348,\n \"acc_stderr\": 0.014413988396996074,\n \"acc_norm\": 0.6075085324232082,\n \"acc_norm_stderr\": 0.014269634635670731\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6137223660625374,\n \"acc_stderr\": 0.004859004184694605,\n \"acc_norm\": 0.8161720772754432,\n \"acc_norm_stderr\": 0.0038655217623631624\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 
0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715563,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715563\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728762,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728762\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.0250437573185202,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.0250437573185202\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n \"acc_stderr\": 0.02518900666021238,\n \"acc_norm\": 0.7322580645161291,\n \"acc_norm_stderr\": 0.02518900666021238\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.034139638059062345,\n \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.034139638059062345\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647078,\n \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647078\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154333,\n \"acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154333\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.02759917430064077,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02759917430064077\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7994891443167306,\n \"acc_stderr\": 0.014317653708594207,\n \"acc_norm\": 0.7994891443167306,\n \"acc_norm_stderr\": 0.014317653708594207\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34413407821229053,\n \"acc_stderr\": 0.015889221313307094,\n \"acc_norm\": 0.34413407821229053,\n \"acc_norm_stderr\": 0.015889221313307094\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967294,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967294\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.012739711554045708,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.012739711554045708\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.029972807170464622,\n \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.029972807170464622\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3072215422276622,\n \"mc1_stderr\": 0.016150201321323013,\n \"mc2\": 0.4474047797612128,\n \"mc2_stderr\": 0.014194757681816514\n }\n}\n```", "repo_url": "https://huggingface.co/OptimalScale/robin-65b-v2-delta", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": 
[{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|arc:challenge|25_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hellaswag|10_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-08T22:36:48.972258.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-08T22:36:48.972258.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-08T22:36:48.972258.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-08T22:36:48.972258.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-08T22:36:48.972258.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_08T22_36_48.972258", "path": ["results_2023-08-08T22:36:48.972258.parquet"]}, {"split": "latest", "path": ["results_2023-08-08T22:36:48.972258.parquet"]}]}]}
|
2023-08-27T11:24:43+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of OptimalScale/robin-65b-v2-delta
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model OptimalScale/robin-65b-v2-delta on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
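```python
from datasets import load_dataset

# Config name and split taken from this card's own configuration list
data = load_dataset("open-llm-leaderboard/details_OptimalScale__robin-65b-v2-delta",
	"harness_truthfulqa_mc_0",
	split="train")
```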
## Latest results
These are the latest results from run 2023-08-08T22:36:48.972258 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of OptimalScale/robin-65b-v2-delta",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model OptimalScale/robin-65b-v2-delta on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-08T22:36:48.972258 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of OptimalScale/robin-65b-v2-delta",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model OptimalScale/robin-65b-v2-delta on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-08T22:36:48.972258 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of OptimalScale/robin-65b-v2-delta## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model OptimalScale/robin-65b-v2-delta on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-08T22:36:48.972258 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
2510d4ece76098ad5255eff17b3acd92ed3b8896
|
# Dataset Card for Evaluation run of HiTZ/alpaca-lora-65b-en-pt-es-ca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/HiTZ/alpaca-lora-65b-en-pt-es-ca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [HiTZ/alpaca-lora-65b-en-pt-es-ca](https://huggingface.co/HiTZ/alpaca-lora-65b-en-pt-es-ca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HiTZ__alpaca-lora-65b-en-pt-es-ca",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T03:43:44.241616](https://huggingface.co/datasets/open-llm-leaderboard/details_HiTZ__alpaca-lora-65b-en-pt-es-ca/blob/main/results_2023-09-17T03-43-44.241616.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.44924496644295303,
"em_stderr": 0.005094018275255409,
"f1": 0.4984060402684574,
"f1_stderr": 0.004892652635239537,
"acc": 0.5359600711595986,
"acc_stderr": 0.011658939983913114
},
"harness|drop|3": {
"em": 0.44924496644295303,
"em_stderr": 0.005094018275255409,
"f1": 0.4984060402684574,
"f1_stderr": 0.004892652635239537
},
"harness|gsm8k|5": {
"acc": 0.266868840030326,
"acc_stderr": 0.012183780551887955
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938275
}
}
```
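The aggregated metrics above can also be reloaded programmatically. A minimal sketch, using the "results" configuration and "latest" split listed in this repository's configuration metadata (the exact record layout is an assumption):
```python
from datasets import load_dataset

# Load the aggregated results of the most recent evaluation run.
# "results" is the configuration holding per-run aggregates; the "latest"
# split always points to the newest run (here 2023-09-17T03:43:44.241616).
results = load_dataset(
    "open-llm-leaderboard/details_HiTZ__alpaca-lora-65b-en-pt-es-ca",
    "results",
    split="latest",
)

# Each row corresponds to one run; inspect the stored aggregate fields.
print(results[0])
```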
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_HiTZ__alpaca-lora-65b-en-pt-es-ca
|
[
"region:us"
] |
2023-08-17T22:52:23+00:00
|
{"pretty_name": "Evaluation run of HiTZ/alpaca-lora-65b-en-pt-es-ca", "dataset_summary": "Dataset automatically created during the evaluation run of model [HiTZ/alpaca-lora-65b-en-pt-es-ca](https://huggingface.co/HiTZ/alpaca-lora-65b-en-pt-es-ca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HiTZ__alpaca-lora-65b-en-pt-es-ca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T03:43:44.241616](https://huggingface.co/datasets/open-llm-leaderboard/details_HiTZ__alpaca-lora-65b-en-pt-es-ca/blob/main/results_2023-09-17T03-43-44.241616.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.44924496644295303,\n \"em_stderr\": 0.005094018275255409,\n \"f1\": 0.4984060402684574,\n \"f1_stderr\": 0.004892652635239537,\n \"acc\": 0.5359600711595986,\n \"acc_stderr\": 0.011658939983913114\n },\n \"harness|drop|3\": {\n \"em\": 0.44924496644295303,\n \"em_stderr\": 0.005094018275255409,\n \"f1\": 0.4984060402684574,\n \"f1_stderr\": 0.004892652635239537\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.266868840030326,\n \"acc_stderr\": 0.012183780551887955\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938275\n }\n}\n```", "repo_url": "https://huggingface.co/HiTZ/alpaca-lora-65b-en-pt-es-ca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|arc:challenge|25_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T03_43_44.241616", "path": ["**/details_harness|drop|3_2023-09-17T03-43-44.241616.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T03-43-44.241616.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T03_43_44.241616", "path": ["**/details_harness|gsm8k|5_2023-09-17T03-43-44.241616.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T03-43-44.241616.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hellaswag|10_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T23:39:25.347647.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T23:39:25.347647.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-04T23:39:25.347647.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-04T23:39:25.347647.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-04T23:39:25.347647.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T03_43_44.241616", "path": ["**/details_harness|winogrande|5_2023-09-17T03-43-44.241616.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T03-43-44.241616.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_04T23_39_25.347647", "path": ["results_2023-08-04T23:39:25.347647.parquet"]}, {"split": "2023_09_17T03_43_44.241616", "path": ["results_2023-09-17T03-43-44.241616.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T03-43-44.241616.parquet"]}]}]}
|
2023-09-17T02:43:56+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of HiTZ/alpaca-lora-65b-en-pt-es-ca
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model HiTZ/alpaca-lora-65b-en-pt-es-ca on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-09-17T03:43:44.241616 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of HiTZ/alpaca-lora-65b-en-pt-es-ca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model HiTZ/alpaca-lora-65b-en-pt-es-ca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T03:43:44.241616(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of HiTZ/alpaca-lora-65b-en-pt-es-ca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model HiTZ/alpaca-lora-65b-en-pt-es-ca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T03:43:44.241616(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
28,
31,
176,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of HiTZ/alpaca-lora-65b-en-pt-es-ca## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model HiTZ/alpaca-lora-65b-en-pt-es-ca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T03:43:44.241616(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
9b271dbe21b615505be9dedc096b3e7961f546c0
|
# Dataset Card for Evaluation run of WizardLM/WizardLM-13B-V1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WizardLM/WizardLM-13B-V1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [WizardLM/WizardLM-13B-V1.2](https://huggingface.co/WizardLM/WizardLM-13B-V1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T23:07:01.737511](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.2/blob/main/results_2023-10-18T23-07-01.737511.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.09133808724832215,
"em_stderr": 0.002950304012601038,
"f1": 0.1617292365771806,
"f1_stderr": 0.0032231699829319426,
"acc": 0.4269860152120696,
"acc_stderr": 0.011021928189223498
},
"harness|drop|3": {
"em": 0.09133808724832215,
"em_stderr": 0.002950304012601038,
"f1": 0.1617292365771806,
"f1_stderr": 0.0032231699829319426
},
"harness|gsm8k|5": {
"acc": 0.13495072024260804,
"acc_stderr": 0.009411315282571171
},
"harness|winogrande|5": {
"acc": 0.7190213101815311,
"acc_stderr": 0.012632541095875825
}
}
```
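These aggregates can likewise be reloaded from the repository. A minimal sketch, assuming the same "results" configuration and "latest" split convention used by the other leaderboard detail repositories:
```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent run of this model.
# The "latest" split tracks the newest results file
# (here 2023-10-18T23:07:01.737511).
results = load_dataset(
    "open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.2",
    "results",
    split="latest",
)

print(results[0])
```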
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.2
|
[
"region:us"
] |
2023-08-17T22:52:31+00:00
|
{"pretty_name": "Evaluation run of WizardLM/WizardLM-13B-V1.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [WizardLM/WizardLM-13B-V1.2](https://huggingface.co/WizardLM/WizardLM-13B-V1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T23:07:01.737511](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.2/blob/main/results_2023-10-18T23-07-01.737511.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.09133808724832215,\n \"em_stderr\": 0.002950304012601038,\n \"f1\": 0.1617292365771806,\n \"f1_stderr\": 0.0032231699829319426,\n \"acc\": 0.4269860152120696,\n \"acc_stderr\": 0.011021928189223498\n },\n \"harness|drop|3\": {\n \"em\": 0.09133808724832215,\n \"em_stderr\": 0.002950304012601038,\n \"f1\": 0.1617292365771806,\n \"f1_stderr\": 0.0032231699829319426\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13495072024260804,\n \"acc_stderr\": 0.009411315282571171\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7190213101815311,\n \"acc_stderr\": 0.012632541095875825\n }\n}\n```", "repo_url": "https://huggingface.co/WizardLM/WizardLM-13B-V1.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|arc:challenge|25_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T23_07_01.737511", "path": ["**/details_harness|drop|3_2023-10-18T23-07-01.737511.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T23-07-01.737511.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T23_07_01.737511", "path": ["**/details_harness|gsm8k|5_2023-10-18T23-07-01.737511.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T23-07-01.737511.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hellaswag|10_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:20:40.943670.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:20:40.943670.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-26T14:20:40.943670.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T14:20:40.943670.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T14:20:40.943670.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T23_07_01.737511", "path": ["**/details_harness|winogrande|5_2023-10-18T23-07-01.737511.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T23-07-01.737511.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_26T14_20_40.943670", "path": ["results_2023-07-26T14:20:40.943670.parquet"]}, {"split": "2023_10_18T23_07_01.737511", "path": ["results_2023-10-18T23-07-01.737511.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T23-07-01.737511.parquet"]}]}]}
|
2023-10-18T22:07:13+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of WizardLM/WizardLM-13B-V1.2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model WizardLM/WizardLM-13B-V1.2 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
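A minimal sketch (the repository id below follows the leaderboard naming convention for this model and is an assumption, as is the choice of configuration):

```python
from datasets import load_dataset

# Assumed repository id for this model's evaluation details;
# "harness_winogrande_5" is one of the per-task configurations listed in the metadata.
data = load_dataset("open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.2",
	"harness_winogrande_5",
	split="train")
```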
## Latest results
These are the latest results from run 2023-10-18T23:07:01.737511 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of WizardLM/WizardLM-13B-V1.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model WizardLM/WizardLM-13B-V1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T23:07:01.737511(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of WizardLM/WizardLM-13B-V1.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model WizardLM/WizardLM-13B-V1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T23:07:01.737511(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of WizardLM/WizardLM-13B-V1.2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model WizardLM/WizardLM-13B-V1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T23:07:01.737511(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
e1fc8735352f81451c02d1234dd06c24ee87b667
|
# Dataset Card for Evaluation run of WizardLM/WizardLM-13B-V1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WizardLM/WizardLM-13B-V1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [WizardLM/WizardLM-13B-V1.1](https://huggingface.co/WizardLM/WizardLM-13B-V1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T23:24:10.120000](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.1/blob/main/results_2023-10-28T23-24-10.120000.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.24213506711409397,
"em_stderr": 0.004386967355552305,
"f1": 0.3075335570469809,
"f1_stderr": 0.0043568221171957165,
"acc": 0.41585700582764323,
"acc_stderr": 0.00984029249742667
},
"harness|drop|3": {
"em": 0.24213506711409397,
"em_stderr": 0.004386967355552305,
"f1": 0.3075335570469809,
"f1_stderr": 0.0043568221171957165
},
"harness|gsm8k|5": {
"acc": 0.08112206216830932,
"acc_stderr": 0.007520395797922653
},
"harness|winogrande|5": {
"acc": 0.7505919494869772,
"acc_stderr": 0.012160189196930687
}
}
```
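If only the aggregated numbers are needed, a minimal sketch is shown below; it assumes the "results" configuration and its "latest" split behave as described above for this repository.

```python
from datasets import load_dataset

# Load only the aggregated metrics of the latest run.
# The "results" config and "latest" split names are assumptions based on the card text above.
results = load_dataset("open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.1",
	"results",
	split="latest")
print(results[0])
```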
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.1
|
[
"region:us"
] |
2023-08-17T22:52:40+00:00
|
{"pretty_name": "Evaluation run of WizardLM/WizardLM-13B-V1.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [WizardLM/WizardLM-13B-V1.1](https://huggingface.co/WizardLM/WizardLM-13B-V1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T23:24:10.120000](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.1/blob/main/results_2023-10-28T23-24-10.120000.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24213506711409397,\n \"em_stderr\": 0.004386967355552305,\n \"f1\": 0.3075335570469809,\n \"f1_stderr\": 0.0043568221171957165,\n \"acc\": 0.41585700582764323,\n \"acc_stderr\": 0.00984029249742667\n },\n \"harness|drop|3\": {\n \"em\": 0.24213506711409397,\n \"em_stderr\": 0.004386967355552305,\n \"f1\": 0.3075335570469809,\n \"f1_stderr\": 0.0043568221171957165\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08112206216830932,\n \"acc_stderr\": 0.007520395797922653\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7505919494869772,\n \"acc_stderr\": 0.012160189196930687\n }\n}\n```", "repo_url": "https://huggingface.co/WizardLM/WizardLM-13B-V1.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|arc:challenge|25_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T15_36_34.719946", "path": ["**/details_harness|drop|3_2023-10-18T15-36-34.719946.parquet"]}, {"split": "2023_10_28T23_24_10.120000", "path": ["**/details_harness|drop|3_2023-10-28T23-24-10.120000.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T23-24-10.120000.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T15_36_34.719946", "path": ["**/details_harness|gsm8k|5_2023-10-18T15-36-34.719946.parquet"]}, {"split": "2023_10_28T23_24_10.120000", "path": ["**/details_harness|gsm8k|5_2023-10-28T23-24-10.120000.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T23-24-10.120000.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": 
["**/details_harness|hellaswag|10_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:16:17.821348.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:16:17.821348.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:16:17.821348.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-07-26T14:16:17.821348.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T14:16:17.821348.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T14:16:17.821348.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T14:16:17.821348.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T15_36_34.719946", "path": ["**/details_harness|winogrande|5_2023-10-18T15-36-34.719946.parquet"]}, {"split": "2023_10_28T23_24_10.120000", "path": ["**/details_harness|winogrande|5_2023-10-28T23-24-10.120000.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T23-24-10.120000.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_26T14_16_17.821348", "path": ["results_2023-07-26T14:16:17.821348.parquet"]}, {"split": "2023_10_18T15_36_34.719946", "path": ["results_2023-10-18T15-36-34.719946.parquet"]}, {"split": "2023_10_28T23_24_10.120000", "path": ["results_2023-10-28T23-24-10.120000.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T23-24-10.120000.parquet"]}]}]}
|
2023-10-28T22:24:18+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of WizardLM/WizardLM-13B-V1.1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model WizardLM/WizardLM-13B-V1.1 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
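A minimal loading sketch is shown below (the repository id is assumed to follow the leaderboard's usual `details_<org>__<model>` naming convention; the configuration name is taken from this dataset's config list):
```python
from datasets import load_dataset

# Minimal sketch: pull the per-sample details for one task configuration.
# The repository id is assumed to follow the leaderboard's
# "details_<org>__<model>" naming convention.
data = load_dataset("open-llm-leaderboard/details_WizardLM__WizardLM-13B-V1.1",
                    "harness_winogrande_5",
                    split="train")
```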
## Latest results
These are the latest results from run 2023-10-28T23:24:10.120000 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of WizardLM/WizardLM-13B-V1.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model WizardLM/WizardLM-13B-V1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T23:24:10.120000(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of WizardLM/WizardLM-13B-V1.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model WizardLM/WizardLM-13B-V1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T23:24:10.120000(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of WizardLM/WizardLM-13B-V1.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model WizardLM/WizardLM-13B-V1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T23:24:10.120000(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
bd65f1e5a0026afe78d2eb3001da48a0484eaaf0
|
# Dataset Card for Evaluation run of WizardLM/WizardCoder-15B-V1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WizardLM/WizardCoder-15B-V1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [WizardLM/WizardCoder-15B-V1.0](https://huggingface.co/WizardLM/WizardCoder-15B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WizardLM__WizardCoder-15B-V1.0",
"harness_winogrande_5",
split="train")
```
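The aggregated metrics live in the "results" configuration; a minimal sketch of reading its most recent snapshot (the same call, just with a different configuration name and the "latest" split) is:
```python
from datasets import load_dataset

# Sketch: load the aggregated scores instead of the per-sample details.
# The "results" configuration and its "latest" split are listed in this
# dataset's configuration table.
results = load_dataset("open-llm-leaderboard/details_WizardLM__WizardCoder-15B-V1.0",
                       "results",
                       split="latest")
```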
## Latest results
These are the [latest results from run 2023-10-18T05:39:03.080415](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardCoder-15B-V1.0/blob/main/results_2023-10-18T05-39-03.080415.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.10171979865771812,
"em_stderr": 0.003095624755865799,
"f1": 0.1654624580536908,
"f1_stderr": 0.0033020160713134682,
"acc": 0.2864625625234491,
"acc_stderr": 0.008973810218487487
},
"harness|drop|3": {
"em": 0.10171979865771812,
"em_stderr": 0.003095624755865799,
"f1": 0.1654624580536908,
"f1_stderr": 0.0033020160713134682
},
"harness|gsm8k|5": {
"acc": 0.02122820318423048,
"acc_stderr": 0.003970449129848635
},
"harness|winogrande|5": {
"acc": 0.5516969218626677,
"acc_stderr": 0.013977171307126338
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_WizardLM__WizardCoder-15B-V1.0
|
[
"region:us"
] |
2023-08-17T22:52:49+00:00
|
{"pretty_name": "Evaluation run of WizardLM/WizardCoder-15B-V1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [WizardLM/WizardCoder-15B-V1.0](https://huggingface.co/WizardLM/WizardCoder-15B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WizardLM__WizardCoder-15B-V1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T05:39:03.080415](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardCoder-15B-V1.0/blob/main/results_2023-10-18T05-39-03.080415.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.10171979865771812,\n \"em_stderr\": 0.003095624755865799,\n \"f1\": 0.1654624580536908,\n \"f1_stderr\": 0.0033020160713134682,\n \"acc\": 0.2864625625234491,\n \"acc_stderr\": 0.008973810218487487\n },\n \"harness|drop|3\": {\n \"em\": 0.10171979865771812,\n \"em_stderr\": 0.003095624755865799,\n \"f1\": 0.1654624580536908,\n \"f1_stderr\": 0.0033020160713134682\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02122820318423048,\n \"acc_stderr\": 0.003970449129848635\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5516969218626677,\n \"acc_stderr\": 0.013977171307126338\n }\n}\n```", "repo_url": "https://huggingface.co/WizardLM/WizardCoder-15B-V1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|arc:challenge|25_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T05_39_03.080415", "path": ["**/details_harness|drop|3_2023-10-18T05-39-03.080415.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T05-39-03.080415.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T05_39_03.080415", "path": ["**/details_harness|gsm8k|5_2023-10-18T05-39-03.080415.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T05-39-03.080415.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hellaswag|10_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:24:20.327625.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:24:20.327625.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T20:24:20.327625.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T20:24:20.327625.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T20:24:20.327625.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T05_39_03.080415", "path": ["**/details_harness|winogrande|5_2023-10-18T05-39-03.080415.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T05-39-03.080415.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T20_24_20.327625", "path": ["results_2023-07-19T20:24:20.327625.parquet"]}, {"split": "2023_10_18T05_39_03.080415", "path": ["results_2023-10-18T05-39-03.080415.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T05-39-03.080415.parquet"]}]}]}
|
2023-10-18T04:39:15+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of WizardLM/WizardCoder-15B-V1.0
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model WizardLM/WizardCoder-15B-V1.0 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-18T05:39:03.080415 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of WizardLM/WizardCoder-15B-V1.0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model WizardLM/WizardCoder-15B-V1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T05:39:03.080415(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of WizardLM/WizardCoder-15B-V1.0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model WizardLM/WizardCoder-15B-V1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T05:39:03.080415(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of WizardLM/WizardCoder-15B-V1.0## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model WizardLM/WizardCoder-15B-V1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T05:39:03.080415(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
aa590211fd1c03f2cd868a5dae195771855446f8
|
# Dataset Card for Evaluation run of WizardLM/WizardLM-70B-V1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WizardLM/WizardLM-70B-V1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [WizardLM/WizardLM-70B-V1.0](https://huggingface.co/WizardLM/WizardLM-70B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WizardLM__WizardLM-70B-V1.0",
"harness_winogrande_5",
split="train")
```
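To pin the most recent snapshot explicitly and peek at a row, something like the following should also work (a sketch; `latest` is a split declared in this repository's configs, and the exact column names depend on the harness version used for the run):

```python
from datasets import load_dataset

# Same details, but pinned to the "latest" split and inspected row by row.
data = load_dataset("open-llm-leaderboard/details_WizardLM__WizardLM-70B-V1.0",
                    "harness_winogrande_5",
                    split="latest")
print(data.column_names)  # columns vary with the harness version
print(data[0])
```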
## Latest results
These are the [latest results from run 2023-10-15T19:05:20.351798](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardLM-70B-V1.0/blob/main/results_2023-10-15T19-05-20.351798.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.26541526845637586,
"em_stderr": 0.004521927044730418,
"f1": 0.3270648070469802,
"f1_stderr": 0.004444377320494032,
"acc": 0.49394497158582623,
"acc_stderr": 0.010820164814450885
},
"harness|drop|3": {
"em": 0.26541526845637586,
"em_stderr": 0.004521927044730418,
"f1": 0.3270648070469802,
"f1_stderr": 0.004444377320494032
},
"harness|gsm8k|5": {
"acc": 0.17968157695223655,
"acc_stderr": 0.010575119964242244
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
}
}
```
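As a quick reading aid (not part of the evaluation output), the standard errors above can be turned into a rough normal-approximation 95% interval, e.g. for the winogrande accuracy:

```python
# Illustrative only: normal-approximation 95% interval for the winogrande
# accuracy and standard error quoted in the results block above.
acc = 0.8082083662194159
acc_stderr = 0.011065209664659527
low, high = acc - 1.96 * acc_stderr, acc + 1.96 * acc_stderr
print(f"winogrande acc: {acc:.4f}, 95% CI ~ [{low:.4f}, {high:.4f}]")
```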
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_WizardLM__WizardLM-70B-V1.0
|
[
"region:us"
] |
2023-08-17T22:52:58+00:00
|
{"pretty_name": "Evaluation run of WizardLM/WizardLM-70B-V1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [WizardLM/WizardLM-70B-V1.0](https://huggingface.co/WizardLM/WizardLM-70B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WizardLM__WizardLM-70B-V1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T19:05:20.351798](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardLM-70B-V1.0/blob/main/results_2023-10-15T19-05-20.351798.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.26541526845637586,\n \"em_stderr\": 0.004521927044730418,\n \"f1\": 0.3270648070469802,\n \"f1_stderr\": 0.004444377320494032,\n \"acc\": 0.49394497158582623,\n \"acc_stderr\": 0.010820164814450885\n },\n \"harness|drop|3\": {\n \"em\": 0.26541526845637586,\n \"em_stderr\": 0.004521927044730418,\n \"f1\": 0.3270648070469802,\n \"f1_stderr\": 0.004444377320494032\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17968157695223655,\n \"acc_stderr\": 0.010575119964242244\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n }\n}\n```", "repo_url": "https://huggingface.co/WizardLM/WizardLM-70B-V1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|arc:challenge|25_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T19_05_20.351798", "path": ["**/details_harness|drop|3_2023-10-15T19-05-20.351798.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T19-05-20.351798.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T19_05_20.351798", "path": ["**/details_harness|gsm8k|5_2023-10-15T19-05-20.351798.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T19-05-20.351798.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hellaswag|10_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T00:01:57.828467.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T00:01:57.828467.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T00:01:57.828467.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T00:01:57.828467.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T00:01:57.828467.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T19_05_20.351798", "path": ["**/details_harness|winogrande|5_2023-10-15T19-05-20.351798.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T19-05-20.351798.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T00_01_57.828467", "path": ["results_2023-08-17T00:01:57.828467.parquet"]}, {"split": "2023_10_15T19_05_20.351798", "path": ["results_2023-10-15T19-05-20.351798.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T19-05-20.351798.parquet"]}]}]}
|
2023-10-15T18:05:32+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of WizardLM/WizardLM-70B-V1.0
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model WizardLM/WizardLM-70B-V1.0 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-15T19:05:20.351798 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of WizardLM/WizardLM-70B-V1.0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model WizardLM/WizardLM-70B-V1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T19:05:20.351798(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of WizardLM/WizardLM-70B-V1.0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model WizardLM/WizardLM-70B-V1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T19:05:20.351798(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of WizardLM/WizardLM-70B-V1.0## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model WizardLM/WizardLM-70B-V1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T19:05:20.351798(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b7b584a1a714b060955dc520b6d5731c308ca315
|
# Dataset Card for Evaluation run of jarradh/llama2_70b_chat_uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jarradh/llama2_70b_chat_uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jarradh/llama2_70b_chat_uncensored](https://huggingface.co/jarradh/llama2_70b_chat_uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jarradh__llama2_70b_chat_uncensored",
"harness_winogrande_5",
split="train")
```
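The aggregated numbers shown under "Latest results" below can also be pulled from the "results" configuration mentioned above (a sketch; the row layout mirrors the JSON that follows):

```python
from datasets import load_dataset

# "results" and its "latest" split are declared for this repository; the row
# should mirror the aggregated JSON shown in the "Latest results" section.
results = load_dataset("open-llm-leaderboard/details_jarradh__llama2_70b_chat_uncensored",
                       "results",
                       split="latest")
print(results[0])
```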
## Latest results
These are the [latest results from run 2023-10-13T07:51:05.565296](https://huggingface.co/datasets/open-llm-leaderboard/details_jarradh__llama2_70b_chat_uncensored/blob/main/results_2023-10-13T07-51-05.565296.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.14586828859060402,
"em_stderr": 0.003614785389347219,
"f1": 0.2008619966442949,
"f1_stderr": 0.0036435562383754947,
"acc": 0.5640370566063477,
"acc_stderr": 0.011658866017842285
},
"harness|drop|3": {
"em": 0.14586828859060402,
"em_stderr": 0.003614785389347219,
"f1": 0.2008619966442949,
"f1_stderr": 0.0036435562383754947
},
"harness|gsm8k|5": {
"acc": 0.3025018953752843,
"acc_stderr": 0.012652544133186129
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498442
}
}
```
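A small sanity check one can run on the DROP figures above (illustrative only): token-level F1 upper-bounds exact match, so the gap should always be non-negative:

```python
# Illustrative check on the DROP metrics quoted above: F1 >= EM by construction.
em = 0.14586828859060402
f1 = 0.2008619966442949
assert f1 >= em
print(f"DROP em={em:.3f}, f1={f1:.3f}, gap={f1 - em:.3f}")
```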
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jarradh__llama2_70b_chat_uncensored
|
[
"region:us"
] |
2023-08-17T22:53:07+00:00
|
{"pretty_name": "Evaluation run of jarradh/llama2_70b_chat_uncensored", "dataset_summary": "Dataset automatically created during the evaluation run of model [jarradh/llama2_70b_chat_uncensored](https://huggingface.co/jarradh/llama2_70b_chat_uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jarradh__llama2_70b_chat_uncensored\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T07:51:05.565296](https://huggingface.co/datasets/open-llm-leaderboard/details_jarradh__llama2_70b_chat_uncensored/blob/main/results_2023-10-13T07-51-05.565296.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.14586828859060402,\n \"em_stderr\": 0.003614785389347219,\n \"f1\": 0.2008619966442949,\n \"f1_stderr\": 0.0036435562383754947,\n \"acc\": 0.5640370566063477,\n \"acc_stderr\": 0.011658866017842285\n },\n \"harness|drop|3\": {\n \"em\": 0.14586828859060402,\n \"em_stderr\": 0.003614785389347219,\n \"f1\": 0.2008619966442949,\n \"f1_stderr\": 0.0036435562383754947\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3025018953752843,\n \"acc_stderr\": 0.012652544133186129\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498442\n }\n}\n```", "repo_url": "https://huggingface.co/jarradh/llama2_70b_chat_uncensored", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|arc:challenge|25_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T07_51_05.565296", "path": ["**/details_harness|drop|3_2023-10-13T07-51-05.565296.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T07-51-05.565296.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T07_51_05.565296", "path": ["**/details_harness|gsm8k|5_2023-10-13T07-51-05.565296.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T07-51-05.565296.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hellaswag|10_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:41:26.455015.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:41:26.455015.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T17:41:26.455015.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T17:41:26.455015.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T17:41:26.455015.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T07_51_05.565296", "path": ["**/details_harness|winogrande|5_2023-10-13T07-51-05.565296.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T07-51-05.565296.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T17_41_26.455015", "path": ["results_2023-08-09T17:41:26.455015.parquet"]}, {"split": "2023_10_13T07_51_05.565296", "path": ["results_2023-10-13T07-51-05.565296.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T07-51-05.565296.parquet"]}]}]}
|
2023-10-13T06:51:17+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jarradh/llama2_70b_chat_uncensored
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jarradh/llama2_70b_chat_uncensored on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
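A minimal loading sketch, mirroring the call recorded in this run's metadata (the `harness_winogrande_5` configuration is only an example; any configuration listed for this repository can be substituted):

```python
from datasets import load_dataset

# Per-sample details for one configuration of this evaluation run
data = load_dataset("open-llm-leaderboard/details_jarradh__llama2_70b_chat_uncensored",
                    "harness_winogrande_5",
                    split="train")
```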
## Latest results
These are the latest results from run 2023-10-13T07:51:05.565296 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
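The aggregated figures for that run, reproduced from the run metadata above:

```python
{
    "all": {
        "em": 0.14586828859060402,
        "em_stderr": 0.003614785389347219,
        "f1": 0.2008619966442949,
        "f1_stderr": 0.0036435562383754947,
        "acc": 0.5640370566063477,
        "acc_stderr": 0.011658866017842285
    },
    "harness|drop|3": {
        "em": 0.14586828859060402,
        "em_stderr": 0.003614785389347219,
        "f1": 0.2008619966442949,
        "f1_stderr": 0.0036435562383754947
    },
    "harness|gsm8k|5": {
        "acc": 0.3025018953752843,
        "acc_stderr": 0.012652544133186129
    },
    "harness|winogrande|5": {
        "acc": 0.8255722178374112,
        "acc_stderr": 0.010665187902498442
    }
}
```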
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jarradh/llama2_70b_chat_uncensored",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jarradh/llama2_70b_chat_uncensored on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-13T07:51:05.565296(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jarradh/llama2_70b_chat_uncensored",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jarradh/llama2_70b_chat_uncensored on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-13T07:51:05.565296(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jarradh/llama2_70b_chat_uncensored## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jarradh/llama2_70b_chat_uncensored on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T07:51:05.565296(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
8c6d42a406d3539a9c514bbcda44527af5984d9e
|
# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CobraMamba/mamba-gpt-3b-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [CobraMamba/mamba-gpt-3b-v2](https://huggingface.co/CobraMamba/mamba-gpt-3b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v2",
"harness_truthfulqa_mc_0",
split="train")
```
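If only the aggregated metrics are needed, the "results" configuration mentioned above can be loaded the same way. The `latest` split name below is an assumption based on the timestamp-based split naming used by sibling evaluation runs in this collection; check the repository's configs if it does not resolve.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run (split name assumed)
results = load_dataset("open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v2",
                       "results",
                       split="latest")
```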
## Latest results
These are the [latest results from run 2023-07-27T10:38:04.478796](https://huggingface.co/datasets/open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v2/blob/main/results_2023-07-27T10%3A38%3A04.478796.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2773460846321751,
"acc_stderr": 0.03238661474119926,
"acc_norm": 0.2811003145526316,
"acc_norm_stderr": 0.03238192601606109,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871112,
"mc2": 0.36736867775864135,
"mc2_stderr": 0.013883449396602417
},
"harness|arc:challenge|25": {
"acc": 0.386518771331058,
"acc_stderr": 0.01423008476191048,
"acc_norm": 0.42150170648464164,
"acc_norm_stderr": 0.014430197069326023
},
"harness|hellaswag|10": {
"acc": 0.5284803823939455,
"acc_stderr": 0.004981680090303706,
"acc_norm": 0.7149970125473013,
"acc_norm_stderr": 0.0045049329997363914
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03459777606810536,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03459777606810536
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.02737770662467071,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.02737770662467071
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.1791907514450867,
"acc_stderr": 0.029242513059063294,
"acc_norm": 0.1791907514450867,
"acc_norm_stderr": 0.029242513059063294
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.030976692998534443,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.030976692998534443
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669416,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669416
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.19310344827586207,
"acc_stderr": 0.03289445522127401,
"acc_norm": 0.19310344827586207,
"acc_norm_stderr": 0.03289445522127401
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113946,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113946
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243156,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02962022787479047,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02962022787479047
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20725388601036268,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.20725388601036268,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2358974358974359,
"acc_stderr": 0.021525965407408726,
"acc_norm": 0.2358974358974359,
"acc_norm_stderr": 0.021525965407408726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275794,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275794
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.02865749128507198,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.02865749128507198
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26055045871559634,
"acc_stderr": 0.018819182034850068,
"acc_norm": 0.26055045871559634,
"acc_norm_stderr": 0.018819182034850068
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.026991454502036733,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.026991454502036733
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.35874439461883406,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.35874439461883406,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.32231404958677684,
"acc_stderr": 0.04266416363352167,
"acc_norm": 0.32231404958677684,
"acc_norm_stderr": 0.04266416363352167
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326468,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326468
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.030236389942173092,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.030236389942173092
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3090676883780332,
"acc_stderr": 0.016524988919702197,
"acc_norm": 0.3090676883780332,
"acc_norm_stderr": 0.016524988919702197
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.01435591196476786,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.01435591196476786
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.026891709428343957,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.026891709428343957
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24771838331160365,
"acc_stderr": 0.011025499291443742,
"acc_norm": 0.24771838331160365,
"acc_norm_stderr": 0.011025499291443742
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.023157468308559366,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.023157468308559366
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.018120224251484587,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.018120224251484587
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.04494290866252088,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.04494290866252088
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2653061224489796,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.2653061224489796,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573044,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573044
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.036293353299478595,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.036293353299478595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871112,
"mc2": 0.36736867775864135,
"mc2_stderr": 0.013883449396602417
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v2
|
[
"region:us"
] |
2023-08-17T22:53:16+00:00
|
{"pretty_name": "Evaluation run of CobraMamba/mamba-gpt-3b-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [CobraMamba/mamba-gpt-3b-v2](https://huggingface.co/CobraMamba/mamba-gpt-3b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v2\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-27T10:38:04.478796](https://huggingface.co/datasets/open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v2/blob/main/results_2023-07-27T10%3A38%3A04.478796.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2773460846321751,\n \"acc_stderr\": 0.03238661474119926,\n \"acc_norm\": 0.2811003145526316,\n \"acc_norm_stderr\": 0.03238192601606109,\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871112,\n \"mc2\": 0.36736867775864135,\n \"mc2_stderr\": 0.013883449396602417\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.386518771331058,\n \"acc_stderr\": 0.01423008476191048,\n \"acc_norm\": 0.42150170648464164,\n \"acc_norm_stderr\": 0.014430197069326023\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5284803823939455,\n \"acc_stderr\": 0.004981680090303706,\n \"acc_norm\": 0.7149970125473013,\n \"acc_norm_stderr\": 0.0045049329997363914\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03459777606810536,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03459777606810536\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.02737770662467071,\n \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.02737770662467071\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n 
\"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.1791907514450867,\n \"acc_stderr\": 0.029242513059063294,\n \"acc_norm\": 0.1791907514450867,\n \"acc_norm_stderr\": 0.029242513059063294\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.030976692998534443,\n \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.030976692998534443\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669416,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669416\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.19310344827586207,\n \"acc_stderr\": 0.03289445522127401,\n \"acc_norm\": 0.19310344827586207,\n \"acc_norm_stderr\": 0.03289445522127401\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113946,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113946\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23870967741935484,\n \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.23870967741935484,\n \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479047,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479047\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.02925282329180363,\n \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.02925282329180363\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.2358974358974359,\n \"acc_stderr\": 0.021525965407408726,\n \"acc_norm\": 0.2358974358974359,\n \"acc_norm_stderr\": 0.021525965407408726\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.02865749128507198,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.02865749128507198\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.26055045871559634,\n \"acc_stderr\": 0.018819182034850068,\n \"acc_norm\": 0.26055045871559634,\n \"acc_norm_stderr\": 0.018819182034850068\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.19444444444444445,\n \"acc_stderr\": 0.026991454502036733,\n \"acc_norm\": 0.19444444444444445,\n \"acc_norm_stderr\": 0.026991454502036733\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.02933116229425174,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02933116229425174\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.35874439461883406,\n \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.35874439461883406,\n \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.32231404958677684,\n \"acc_stderr\": 0.04266416363352167,\n \"acc_norm\": 0.32231404958677684,\n \"acc_norm_stderr\": 0.04266416363352167\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326468,\n \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326468\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3076923076923077,\n \"acc_stderr\": 0.030236389942173092,\n \"acc_norm\": 0.3076923076923077,\n \"acc_norm_stderr\": 0.030236389942173092\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3090676883780332,\n \"acc_stderr\": 
0.016524988919702197,\n \"acc_norm\": 0.3090676883780332,\n \"acc_norm_stderr\": 0.016524988919702197\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.02344582627654554,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.02344582627654554\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.01435591196476786,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.01435591196476786\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.02532988817190092,\n \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.02532988817190092\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.28368794326241137,\n \"acc_stderr\": 0.026891709428343957,\n \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.026891709428343957\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24771838331160365,\n \"acc_stderr\": 0.011025499291443742,\n \"acc_norm\": 0.24771838331160365,\n \"acc_norm_stderr\": 0.011025499291443742\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.023157468308559366,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.023157468308559366\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.018120224251484587,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.018120224251484587\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n \"acc_stderr\": 0.04494290866252088,\n \"acc_norm\": 0.32727272727272727,\n \"acc_norm_stderr\": 0.04494290866252088\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2653061224489796,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.2653061224489796,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n \"acc_stderr\": 0.030965903123573044,\n \"acc_norm\": 0.25870646766169153,\n \"acc_norm_stderr\": 0.030965903123573044\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n \"acc_stderr\": 0.036293353299478595,\n \"acc_norm\": 0.3192771084337349,\n \"acc_norm_stderr\": 0.036293353299478595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871112,\n \"mc2\": 0.36736867775864135,\n \"mc2_stderr\": 0.013883449396602417\n }\n}\n```", "repo_url": "https://huggingface.co/CobraMamba/mamba-gpt-3b-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": 
[{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|arc:challenge|25_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hellaswag|10_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T10:38:04.478796.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-27T10:38:04.478796.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T10:38:04.478796.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-27T10:38:04.478796.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-27T10:38:04.478796.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_27T10_38_04.478796", "path": ["results_2023-07-27T10:38:04.478796.parquet"]}, {"split": "latest", "path": ["results_2023-07-27T10:38:04.478796.parquet"]}]}]}
|
2023-08-27T11:24:54+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b-v2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model CobraMamba/mamba-gpt-3b-v2 on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
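A minimal sketch of such a load call is given below. The dataset repository name is assumed from the leaderboard's usual `details_<org>__<model>` naming, and the chosen configuration and split are taken from this card's configuration list, so treat them as assumptions to adjust for your own use.
```python
from datasets import load_dataset

# Assumed names: the repository follows the leaderboard's
# details_<org>__<model> convention; "harness_arc_challenge_25" is one of
# the 61 task configurations and "latest" points to the most recent run.
data = load_dataset(
    "open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v2",
    "harness_arc_challenge_25",
    split="latest",
)
print(data[0])
```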
## Latest results
These are the latest results from run 2023-07-27T10:38:04.478796 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model CobraMamba/mamba-gpt-3b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-07-27T10:38:04.478796 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model CobraMamba/mamba-gpt-3b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-07-27T10:38:04.478796 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b-v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model CobraMamba/mamba-gpt-3b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-07-27T10:38:04.478796 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
c90ade3692d5256c65a12eedebca2ce67b69bb7a
|
# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CobraMamba/mamba-gpt-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [CobraMamba/mamba-gpt-3b](https://huggingface.co/CobraMamba/mamba-gpt-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b",
"harness_winogrande_5",
split="train")
```
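As a further illustrative sketch, the aggregated metrics mentioned above can be loaded through the "results" configuration. The configuration and split names here are taken from this card's text and configuration list, so verify them against the repository before relying on them.

```python
from datasets import load_dataset

# "results" holds the aggregated metrics of each run; the "latest" split
# points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b",
    "results",
    split="latest",
)
print(results[0])
```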
## Latest results
These are the [latest results from run 2023-10-14T18:11:22.900658](https://huggingface.co/datasets/open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b/blob/main/results_2023-10-14T18-11-22.900658.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.030935402684563757,
"em_stderr": 0.001773143572150097,
"f1": 0.08682676174496619,
"f1_stderr": 0.002178422368740777,
"acc": 0.32631481001667695,
"acc_stderr": 0.007357115747858964
},
"harness|drop|3": {
"em": 0.030935402684563757,
"em_stderr": 0.001773143572150097,
"f1": 0.08682676174496619,
"f1_stderr": 0.002178422368740777
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.001312157814867419
},
"harness|winogrande|5": {
"acc": 0.6503551696921863,
"acc_stderr": 0.013402073680850508
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b
|
[
"region:us"
] |
2023-08-17T22:53:24+00:00
|
{"pretty_name": "Evaluation run of CobraMamba/mamba-gpt-3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [CobraMamba/mamba-gpt-3b](https://huggingface.co/CobraMamba/mamba-gpt-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-14T18:11:22.900658](https://huggingface.co/datasets/open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b/blob/main/results_2023-10-14T18-11-22.900658.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.030935402684563757,\n \"em_stderr\": 0.001773143572150097,\n \"f1\": 0.08682676174496619,\n \"f1_stderr\": 0.002178422368740777,\n \"acc\": 0.32631481001667695,\n \"acc_stderr\": 0.007357115747858964\n },\n \"harness|drop|3\": {\n \"em\": 0.030935402684563757,\n \"em_stderr\": 0.001773143572150097,\n \"f1\": 0.08682676174496619,\n \"f1_stderr\": 0.002178422368740777\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \"acc_stderr\": 0.001312157814867419\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6503551696921863,\n \"acc_stderr\": 0.013402073680850508\n }\n}\n```", "repo_url": "https://huggingface.co/CobraMamba/mamba-gpt-3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_14T18_11_22.900658", "path": ["**/details_harness|drop|3_2023-10-14T18-11-22.900658.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-14T18-11-22.900658.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_14T18_11_22.900658", "path": ["**/details_harness|gsm8k|5_2023-10-14T18-11-22.900658.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-14T18-11-22.900658.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:20:46.724343.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:20:46.724343.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:20:46.724343.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:20:46.724343.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:20:46.724343.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_14T18_11_22.900658", "path": ["**/details_harness|winogrande|5_2023-10-14T18-11-22.900658.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-14T18-11-22.900658.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_20_46.724343", "path": ["results_2023-07-19T15:20:46.724343.parquet"]}, {"split": "2023_10_14T18_11_22.900658", "path": ["results_2023-10-14T18-11-22.900658.parquet"]}, {"split": "latest", "path": ["results_2023-10-14T18-11-22.900658.parquet"]}]}]}
|
2023-10-14T17:12:30+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model CobraMamba/mamba-gpt-3b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-14T18:11:22.900658 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model CobraMamba/mamba-gpt-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-14T18:11:22.900658(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model CobraMamba/mamba-gpt-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-14T18:11:22.900658(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model CobraMamba/mamba-gpt-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-14T18:11:22.900658(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
c122254c55af22dc828765655976d57bdfd1173d
|
# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CobraMamba/mamba-gpt-3b-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [CobraMamba/mamba-gpt-3b-v3](https://huggingface.co/CobraMamba/mamba-gpt-3b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v3",
"harness_winogrande_5",
split="train")
```
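
If you only need the aggregated metrics rather than the per-task details, the "results" configuration mentioned above can be loaded the same way. A minimal sketch, assuming the config and split names follow the conventions described in this card:

```python
from datasets import load_dataset

# Aggregated results; the "latest" split points to the most recent run
results = load_dataset(
    "open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v3",
    "results",
    split="latest",
)
```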
## Latest results
These are the [latest results from run 2023-09-22T22:45:10.282154](https://huggingface.co/datasets/open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v3/blob/main/results_2023-09-22T22-45-10.282154.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.006816275167785235,
"em_stderr": 0.0008426127095859263,
"f1": 0.06266883389261749,
"f1_stderr": 0.0015405842311829617,
"acc": 0.34347640848673355,
"acc_stderr": 0.008090409857327651
},
"harness|drop|3": {
"em": 0.006816275167785235,
"em_stderr": 0.0008426127095859263,
"f1": 0.06266883389261749,
"f1_stderr": 0.0015405842311829617
},
"harness|gsm8k|5": {
"acc": 0.012130401819560273,
"acc_stderr": 0.0030152942428909465
},
"harness|winogrande|5": {
"acc": 0.6748224151539068,
"acc_stderr": 0.013165525471764358
}
}
```
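
For a quick look at the per-sample predictions behind these numbers, a loaded details split can be converted to a pandas DataFrame. A minimal sketch, using one of the harness configs listed in this card's metadata (the split name is an assumption based on the conventions above):

```python
from datasets import load_dataset

# Per-sample details for one task of this evaluation run
details = load_dataset(
    "open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v3",
    "harness_winogrande_5",
    split="latest",
)

# Inspect the first few evaluated examples as a DataFrame
df = details.to_pandas()
print(df.head())
```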
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v3
|
[
"region:us"
] |
2023-08-17T22:53:33+00:00
|
{"pretty_name": "Evaluation run of CobraMamba/mamba-gpt-3b-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [CobraMamba/mamba-gpt-3b-v3](https://huggingface.co/CobraMamba/mamba-gpt-3b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T22:45:10.282154](https://huggingface.co/datasets/open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v3/blob/main/results_2023-09-22T22-45-10.282154.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006816275167785235,\n \"em_stderr\": 0.0008426127095859263,\n \"f1\": 0.06266883389261749,\n \"f1_stderr\": 0.0015405842311829617,\n \"acc\": 0.34347640848673355,\n \"acc_stderr\": 0.008090409857327651\n },\n \"harness|drop|3\": {\n \"em\": 0.006816275167785235,\n \"em_stderr\": 0.0008426127095859263,\n \"f1\": 0.06266883389261749,\n \"f1_stderr\": 0.0015405842311829617\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.012130401819560273,\n \"acc_stderr\": 0.0030152942428909465\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6748224151539068,\n \"acc_stderr\": 0.013165525471764358\n }\n}\n```", "repo_url": "https://huggingface.co/CobraMamba/mamba-gpt-3b-v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|arc:challenge|25_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T22_45_10.282154", "path": ["**/details_harness|drop|3_2023-09-22T22-45-10.282154.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T22-45-10.282154.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T22_45_10.282154", "path": ["**/details_harness|gsm8k|5_2023-09-22T22-45-10.282154.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T22-45-10.282154.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hellaswag|10_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:05:11.940524.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:05:11.940524.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T14:05:11.940524.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T14:05:11.940524.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T14:05:11.940524.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T22_45_10.282154", "path": ["**/details_harness|winogrande|5_2023-09-22T22-45-10.282154.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T22-45-10.282154.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_01T14_05_11.940524", "path": ["results_2023-08-01T14:05:11.940524.parquet"]}, {"split": "2023_09_22T22_45_10.282154", "path": ["results_2023-09-22T22-45-10.282154.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T22-45-10.282154.parquet"]}]}]}
|
2023-09-22T21:45:22+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b-v3
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model CobraMamba/mamba-gpt-3b-v3 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
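For example (a minimal sketch using the `datasets` library; the repository name below assumes the usual `open-llm-leaderboard/details_<org>__<model>` naming pattern, and `harness_winogrande_5` is one of the configurations listed in this card's metadata):

```python
from datasets import load_dataset

# Load one evaluation configuration; the repository name is assumed to follow
# the standard open-llm-leaderboard details naming for this model.
data = load_dataset(
    "open-llm-leaderboard/details_CobraMamba__mamba-gpt-3b-v3",
    "harness_winogrande_5",
    split="train",
)
```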
## Latest results
These are the latest results from run 2023-09-22T22:45:10.282154 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b-v3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model CobraMamba/mamba-gpt-3b-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T22:45:10.282154(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b-v3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model CobraMamba/mamba-gpt-3b-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T22:45:10.282154(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-3b-v3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model CobraMamba/mamba-gpt-3b-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T22:45:10.282154(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
15ef60e7063badfc4de20d7d9f5be3a980b8a085
|
# Dataset Card for Evaluation run of HuggingFaceH4/starchat-beta
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/HuggingFaceH4/starchat-beta
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [HuggingFaceH4/starchat-beta](https://huggingface.co/HuggingFaceH4/starchat-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HuggingFaceH4__starchat-beta",
"harness_winogrande_5",
split="train")
```
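
The aggregated scores can be loaded in the same way from the "results" configuration; a minimal sketch, assuming the "latest" split name listed in the configuration metadata:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" always points to the most
# recent run (split names are defined in the dataset's configuration metadata).
results = load_dataset(
    "open-llm-leaderboard/details_HuggingFaceH4__starchat-beta",
    "results",
    split="latest",
)
print(results[0])
```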
## Latest results
These are the [latest results from run 2023-10-25T02:34:18.811369](https://huggingface.co/datasets/open-llm-leaderboard/details_HuggingFaceH4__starchat-beta/blob/main/results_2023-10-25T02-34-18.811369.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004089765100671141,
"em_stderr": 0.0006535802669912854,
"f1": 0.0753890520134228,
"f1_stderr": 0.001735962486740498,
"acc": 0.3742380352004251,
"acc_stderr": 0.00950380770894865
},
"harness|drop|3": {
"em": 0.004089765100671141,
"em_stderr": 0.0006535802669912854,
"f1": 0.0753890520134228,
"f1_stderr": 0.001735962486740498
},
"harness|gsm8k|5": {
"acc": 0.05155420773313116,
"acc_stderr": 0.006090887955262828
},
"harness|winogrande|5": {
"acc": 0.696921862667719,
"acc_stderr": 0.012916727462634472
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_HuggingFaceH4__starchat-beta
|
[
"region:us"
] |
2023-08-17T22:53:41+00:00
|
{"pretty_name": "Evaluation run of HuggingFaceH4/starchat-beta", "dataset_summary": "Dataset automatically created during the evaluation run of model [HuggingFaceH4/starchat-beta](https://huggingface.co/HuggingFaceH4/starchat-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HuggingFaceH4__starchat-beta\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-25T02:34:18.811369](https://huggingface.co/datasets/open-llm-leaderboard/details_HuggingFaceH4__starchat-beta/blob/main/results_2023-10-25T02-34-18.811369.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004089765100671141,\n \"em_stderr\": 0.0006535802669912854,\n \"f1\": 0.0753890520134228,\n \"f1_stderr\": 0.001735962486740498,\n \"acc\": 0.3742380352004251,\n \"acc_stderr\": 0.00950380770894865\n },\n \"harness|drop|3\": {\n \"em\": 0.004089765100671141,\n \"em_stderr\": 0.0006535802669912854,\n \"f1\": 0.0753890520134228,\n \"f1_stderr\": 0.001735962486740498\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05155420773313116,\n \"acc_stderr\": 0.006090887955262828\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.696921862667719,\n \"acc_stderr\": 0.012916727462634472\n }\n}\n```", "repo_url": "https://huggingface.co/HuggingFaceH4/starchat-beta", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|arc:challenge|25_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_25T02_34_18.811369", "path": ["**/details_harness|drop|3_2023-10-25T02-34-18.811369.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T02-34-18.811369.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_25T02_34_18.811369", "path": ["**/details_harness|gsm8k|5_2023-10-25T02-34-18.811369.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-25T02-34-18.811369.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hellaswag|10_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:08:27.330071.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:08:27.330071.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T21:08:27.330071.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T21:08:27.330071.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T21:08:27.330071.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_25T02_34_18.811369", "path": ["**/details_harness|winogrande|5_2023-10-25T02-34-18.811369.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T02-34-18.811369.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T21_08_27.330071", "path": ["results_2023-07-19T21:08:27.330071.parquet"]}, {"split": "2023_10_25T02_34_18.811369", "path": ["results_2023-10-25T02-34-18.811369.parquet"]}, {"split": "latest", "path": ["results_2023-10-25T02-34-18.811369.parquet"]}]}]}
|
2023-10-25T01:34:31+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of HuggingFaceH4/starchat-beta
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model HuggingFaceH4/starchat-beta on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
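For example (a minimal sketch using the `datasets` library, with the repository and configuration names taken from this card's metadata):

```python
from datasets import load_dataset

# Load one evaluation configuration from the details repository for this model.
data = load_dataset(
    "open-llm-leaderboard/details_HuggingFaceH4__starchat-beta",
    "harness_winogrande_5",
    split="train",
)
```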
## Latest results
These are the latest results from run 2023-10-25T02:34:18.811369 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of HuggingFaceH4/starchat-beta",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model HuggingFaceH4/starchat-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-25T02:34:18.811369(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of HuggingFaceH4/starchat-beta",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model HuggingFaceH4/starchat-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-25T02:34:18.811369(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of HuggingFaceH4/starchat-beta## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model HuggingFaceH4/starchat-beta on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-25T02:34:18.811369(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
1ddda86b88a46d095bf8682fc70559d9b6739d45
|
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-llama-65b-v8-bf16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-llama-65b-v8-bf16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-llama-65b-v8-bf16](https://huggingface.co/OpenBuddy/openbuddy-llama-65b-v8-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-llama-65b-v8-bf16",
"harness_winogrande_5",
split="train")
```
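As a complement, the aggregated metrics described above can be read from the "results" configuration. The sketch below assumes the "latest" split name listed in this repository's metadata.

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points at the most recent snapshot.
results = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-llama-65b-v8-bf16",
    "results",
    split="latest",
)
print(results[0])
```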
## Latest results
These are the [latest results from run 2023-10-15T12:02:25.830257](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-llama-65b-v8-bf16/blob/main/results_2023-10-15T12-02-25.830257.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.45952181208053694,
"em_stderr": 0.005103660800968606,
"f1": 0.5149674916107397,
"f1_stderr": 0.0048640509430360815,
"acc": 0.6165941527298491,
"acc_stderr": 0.01245134316413488
},
"harness|drop|3": {
"em": 0.45952181208053694,
"em_stderr": 0.005103660800968606,
"f1": 0.5149674916107397,
"f1_stderr": 0.0048640509430360815
},
"harness|gsm8k|5": {
"acc": 0.4336618650492798,
"acc_stderr": 0.013650728047064681
},
"harness|winogrande|5": {
"acc": 0.7995264404104183,
"acc_stderr": 0.011251958281205078
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_OpenBuddy__openbuddy-llama-65b-v8-bf16
|
[
"region:us"
] |
2023-08-17T22:53:50+00:00
|
{"pretty_name": "Evaluation run of OpenBuddy/openbuddy-llama-65b-v8-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-llama-65b-v8-bf16](https://huggingface.co/OpenBuddy/openbuddy-llama-65b-v8-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-llama-65b-v8-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T12:02:25.830257](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-llama-65b-v8-bf16/blob/main/results_2023-10-15T12-02-25.830257.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.45952181208053694,\n \"em_stderr\": 0.005103660800968606,\n \"f1\": 0.5149674916107397,\n \"f1_stderr\": 0.0048640509430360815,\n \"acc\": 0.6165941527298491,\n \"acc_stderr\": 0.01245134316413488\n },\n \"harness|drop|3\": {\n \"em\": 0.45952181208053694,\n \"em_stderr\": 0.005103660800968606,\n \"f1\": 0.5149674916107397,\n \"f1_stderr\": 0.0048640509430360815\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4336618650492798,\n \"acc_stderr\": 0.013650728047064681\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7995264404104183,\n \"acc_stderr\": 0.011251958281205078\n }\n}\n```", "repo_url": "https://huggingface.co/OpenBuddy/openbuddy-llama-65b-v8-bf16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|arc:challenge|25_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T12_02_25.830257", "path": ["**/details_harness|drop|3_2023-10-15T12-02-25.830257.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T12-02-25.830257.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T12_02_25.830257", "path": ["**/details_harness|gsm8k|5_2023-10-15T12-02-25.830257.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T12-02-25.830257.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hellaswag|10_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T21:10:34.846324.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T21:10:34.846324.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T21:10:34.846324.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T21:10:34.846324.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T21:10:34.846324.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T21:10:34.846324.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T12_02_25.830257", "path": ["**/details_harness|winogrande|5_2023-10-15T12-02-25.830257.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T12-02-25.830257.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_02T21_10_34.846324", "path": ["results_2023-08-02T21:10:34.846324.parquet"]}, {"split": "2023_10_15T12_02_25.830257", "path": ["results_2023-10-15T12-02-25.830257.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T12-02-25.830257.parquet"]}]}]}
|
2023-10-15T11:02:38+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-llama-65b-v8-bf16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model OpenBuddy/openbuddy-llama-65b-v8-bf16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
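The code block was stripped from this condensed copy; a small sketch follows, reusing the repository id shown in the full card above and `get_dataset_config_names` to discover the available configurations (the configuration name used here is taken from this record's metadata).

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_OpenBuddy__openbuddy-llama-65b-v8-bf16"

# Discover the evaluation configurations available in the detail repository.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Load one of them; the "train" split points at the latest results.
data = load_dataset(repo, "harness_gsm8k_5", split="train")
```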
## Latest results
These are the latest results from run 2023-10-15T12:02:25.830257 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of OpenBuddy/openbuddy-llama-65b-v8-bf16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-llama-65b-v8-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T12:02:25.830257(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of OpenBuddy/openbuddy-llama-65b-v8-bf16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-llama-65b-v8-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T12:02:25.830257(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
28,
31,
176,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of OpenBuddy/openbuddy-llama-65b-v8-bf16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-llama-65b-v8-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T12:02:25.830257(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
66bbadb7e112c04dbe3497782e327e2444cb50b5
|
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-llama2-13b-v8.1-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-llama2-13b-v8.1-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-llama2-13b-v8.1-fp16](https://huggingface.co/OpenBuddy/openbuddy-llama2-13b-v8.1-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v8.1-fp16",
"harness_winogrande_5",
split="train")
```
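Because each run is stored as a timestamp-named split, with "latest" resolving to the newest one, a specific run can also be pinned explicitly. The sketch below takes the split name from this repository's metadata and assumes it is loadable as-is.

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v8.1-fp16"

# "latest" resolves to the most recent run of this task ...
latest = load_dataset(repo, "harness_drop_3", split="latest")

# ... while a timestamped split pins one specific run.
pinned = load_dataset(repo, "harness_drop_3", split="2023_10_17T03_02_04.830771")

print(len(latest), len(pinned))
```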
## Latest results
These are the [latest results from run 2023-10-17T03:02:04.830771](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v8.1-fp16/blob/main/results_2023-10-17T03-02-04.830771.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.35245385906040266,
"em_stderr": 0.004892447185660923,
"f1": 0.40581585570469886,
"f1_stderr": 0.004773989571169136,
"acc": 0.5233743005661293,
"acc_stderr": 0.012467575336089344
},
"harness|drop|3": {
"em": 0.35245385906040266,
"em_stderr": 0.004892447185660923,
"f1": 0.40581585570469886,
"f1_stderr": 0.004773989571169136
},
"harness|gsm8k|5": {
"acc": 0.3032600454890068,
"acc_stderr": 0.012661502663418698
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759989
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v8.1-fp16
|
[
"region:us"
] |
2023-08-17T22:53:58+00:00
|
{"pretty_name": "Evaluation run of OpenBuddy/openbuddy-llama2-13b-v8.1-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-llama2-13b-v8.1-fp16](https://huggingface.co/OpenBuddy/openbuddy-llama2-13b-v8.1-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v8.1-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T03:02:04.830771](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v8.1-fp16/blob/main/results_2023-10-17T03-02-04.830771.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.35245385906040266,\n \"em_stderr\": 0.004892447185660923,\n \"f1\": 0.40581585570469886,\n \"f1_stderr\": 0.004773989571169136,\n \"acc\": 0.5233743005661293,\n \"acc_stderr\": 0.012467575336089344\n },\n \"harness|drop|3\": {\n \"em\": 0.35245385906040266,\n \"em_stderr\": 0.004892447185660923,\n \"f1\": 0.40581585570469886,\n \"f1_stderr\": 0.004773989571169136\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3032600454890068,\n \"acc_stderr\": 0.012661502663418698\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759989\n }\n}\n```", "repo_url": "https://huggingface.co/OpenBuddy/openbuddy-llama2-13b-v8.1-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|arc:challenge|25_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T03_02_04.830771", "path": ["**/details_harness|drop|3_2023-10-17T03-02-04.830771.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T03-02-04.830771.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T03_02_04.830771", "path": ["**/details_harness|gsm8k|5_2023-10-17T03-02-04.830771.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T03-02-04.830771.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hellaswag|10_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:11:44.944856.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:11:44.944856.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:11:44.944856.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T10:11:44.944856.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T10:11:44.944856.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T10:11:44.944856.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T03_02_04.830771", "path": ["**/details_harness|winogrande|5_2023-10-17T03-02-04.830771.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T03-02-04.830771.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T10_11_44.944856", "path": ["results_2023-07-25T10:11:44.944856.parquet"]}, {"split": "2023_10_17T03_02_04.830771", "path": ["results_2023-10-17T03-02-04.830771.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T03-02-04.830771.parquet"]}]}]}
|
2023-10-17T02:02:17+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-llama2-13b-v8.1-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model OpenBuddy/openbuddy-llama2-13b-v8.1-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
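A minimal sketch, using the `harness_winogrande_5` configuration defined for this repository:
```python
from datasets import load_dataset

# Per-sample details for the 5-shot winogrande task; "train" points at the latest run
data = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-llama2-13b-v8.1-fp16",
    "harness_winogrande_5",
    split="train",
)
```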
## Latest results
These are the latest results from run 2023-10-17T03:02:04.830771 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
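The headline figures from that run, as recorded in this repository's metadata:
```python
{
    "all": {
        "em": 0.35245385906040266,
        "em_stderr": 0.004892447185660923,
        "f1": 0.40581585570469886,
        "f1_stderr": 0.004773989571169136,
        "acc": 0.5233743005661293,
        "acc_stderr": 0.012467575336089344
    },
    "harness|drop|3": {
        "em": 0.35245385906040266,
        "em_stderr": 0.004892447185660923,
        "f1": 0.40581585570469886,
        "f1_stderr": 0.004773989571169136
    },
    "harness|gsm8k|5": {
        "acc": 0.3032600454890068,
        "acc_stderr": 0.012661502663418698
    },
    "harness|winogrande|5": {
        "acc": 0.7434885556432518,
        "acc_stderr": 0.012273648008759989
    }
}
```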
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of OpenBuddy/openbuddy-llama2-13b-v8.1-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-llama2-13b-v8.1-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T03:02:04.830771(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of OpenBuddy/openbuddy-llama2-13b-v8.1-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-llama2-13b-v8.1-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T03:02:04.830771(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
29,
31,
177,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of OpenBuddy/openbuddy-llama2-13b-v8.1-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-llama2-13b-v8.1-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T03:02:04.830771(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
905c189c7c1b25c2991ce14937b9494fd1798043
|
# Dataset Card for Evaluation run of bigcode/gpt_bigcode-santacoder
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bigcode/gpt_bigcode-santacoder
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [bigcode/gpt_bigcode-santacoder](https://huggingface.co/bigcode/gpt_bigcode-santacoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigcode__gpt_bigcode-santacoder",
"harness_winogrande_5",
split="train")
```
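
The aggregated metrics described above can be loaded the same way from the "results" configuration; a minimal sketch, assuming this repository follows the same "results"/"latest" layout as the other leaderboard detail repos:

```python
from datasets import load_dataset

# Aggregated metrics ("results" configuration), with "latest" pointing at the most recent run
results = load_dataset(
    "open-llm-leaderboard/details_bigcode__gpt_bigcode-santacoder",
    "results",
    split="latest",
)
```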
## Latest results
These are the [latest results from run 2023-09-17T12:23:19.324032](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__gpt_bigcode-santacoder/blob/main/results_2023-09-17T12-23-19.324032.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413059,
"f1": 0.03720532718120814,
"f1_stderr": 0.0010858123513473891,
"acc": 0.2418011181367818,
"acc_stderr": 0.008020272468716342
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413059,
"f1": 0.03720532718120814,
"f1_stderr": 0.0010858123513473891
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.0020013057209480557
},
"harness|winogrande|5": {
"acc": 0.47829518547750594,
"acc_stderr": 0.014039239216484629
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_bigcode__gpt_bigcode-santacoder
|
[
"region:us"
] |
2023-08-17T22:54:07+00:00
|
{"pretty_name": "Evaluation run of bigcode/gpt_bigcode-santacoder", "dataset_summary": "Dataset automatically created during the evaluation run of model [bigcode/gpt_bigcode-santacoder](https://huggingface.co/bigcode/gpt_bigcode-santacoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigcode__gpt_bigcode-santacoder\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T12:23:19.324032](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__gpt_bigcode-santacoder/blob/main/results_2023-09-17T12-23-19.324032.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413059,\n \"f1\": 0.03720532718120814,\n \"f1_stderr\": 0.0010858123513473891,\n \"acc\": 0.2418011181367818,\n \"acc_stderr\": 0.008020272468716342\n },\n \"harness|drop|3\": {\n \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413059,\n \"f1\": 0.03720532718120814,\n \"f1_stderr\": 0.0010858123513473891\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \"acc_stderr\": 0.0020013057209480557\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.47829518547750594,\n \"acc_stderr\": 0.014039239216484629\n }\n}\n```", "repo_url": "https://huggingface.co/bigcode/gpt_bigcode-santacoder", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T12_23_19.324032", "path": ["**/details_harness|drop|3_2023-09-17T12-23-19.324032.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T12-23-19.324032.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T12_23_19.324032", "path": ["**/details_harness|gsm8k|5_2023-09-17T12-23-19.324032.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T12-23-19.324032.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:43.434285.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:43.434285.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:43.434285.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:05:43.434285.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:05:43.434285.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T12_23_19.324032", "path": ["**/details_harness|winogrande|5_2023-09-17T12-23-19.324032.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T12-23-19.324032.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_05_43.434285", "path": ["results_2023-07-19T19:05:43.434285.parquet"]}, {"split": "2023_09_17T12_23_19.324032", "path": ["results_2023-09-17T12-23-19.324032.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T12-23-19.324032.parquet"]}]}]}
|
2023-09-17T11:23:31+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of bigcode/gpt_bigcode-santacoder
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model bigcode/gpt_bigcode-santacoder on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
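A minimal sketch using the `datasets` library; the repository id below is an assumption following the leaderboard's usual `details_<org>__<model>` naming (it is not spelled out in this card), while `harness_winogrande_5` and the split names are taken from the configurations listed in this card's metadata:

```python
from datasets import load_dataset

# Assumed repo id following the Open LLM Leaderboard naming convention (not stated in this card)
data = load_dataset(
    "open-llm-leaderboard/details_bigcode__gpt_bigcode-santacoder",
    "harness_winogrande_5",  # any configuration name listed in the metadata works here
    split="latest",          # or a timestamped split such as "2023_09_17T12_23_19.324032"
)
print(data)
```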
## Latest results
These are the latest results from run 2023-09-17T12:23:19.324032 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of bigcode/gpt_bigcode-santacoder",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model bigcode/gpt_bigcode-santacoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T12:23:19.324032(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of bigcode/gpt_bigcode-santacoder",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model bigcode/gpt_bigcode-santacoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T12:23:19.324032(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bigcode/gpt_bigcode-santacoder## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model bigcode/gpt_bigcode-santacoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T12:23:19.324032(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
202d836c64d8fb0f555a161f6e422c88d3ba2657
|
# Dataset Card for Evaluation run of bigcode/tiny_starcoder_py
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bigcode/tiny_starcoder_py
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [bigcode/tiny_starcoder_py](https://huggingface.co/bigcode/tiny_starcoder_py) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigcode__tiny_starcoder_py",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T18:41:27.030233](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__tiny_starcoder_py/blob/main/results_2023-09-17T18-41-27.030233.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335755,
"f1": 0.015742449664429566,
"f1_stderr": 0.0006568370194517889,
"acc": 0.2610447871046265,
"acc_stderr": 0.00838467769872364
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335755,
"f1": 0.015742449664429566,
"f1_stderr": 0.0006568370194517889
},
"harness|gsm8k|5": {
"acc": 0.009855951478392721,
"acc_stderr": 0.00272107657704166
},
"harness|winogrande|5": {
"acc": 0.5122336227308603,
"acc_stderr": 0.01404827882040562
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_bigcode__tiny_starcoder_py
|
[
"region:us"
] |
2023-08-17T22:54:15+00:00
|
{"pretty_name": "Evaluation run of bigcode/tiny_starcoder_py", "dataset_summary": "Dataset automatically created during the evaluation run of model [bigcode/tiny_starcoder_py](https://huggingface.co/bigcode/tiny_starcoder_py) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigcode__tiny_starcoder_py\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T18:41:27.030233](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__tiny_starcoder_py/blob/main/results_2023-09-17T18-41-27.030233.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335755,\n \"f1\": 0.015742449664429566,\n \"f1_stderr\": 0.0006568370194517889,\n \"acc\": 0.2610447871046265,\n \"acc_stderr\": 0.00838467769872364\n },\n \"harness|drop|3\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335755,\n \"f1\": 0.015742449664429566,\n \"f1_stderr\": 0.0006568370194517889\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009855951478392721,\n \"acc_stderr\": 0.00272107657704166\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5122336227308603,\n \"acc_stderr\": 0.01404827882040562\n }\n}\n```", "repo_url": "https://huggingface.co/bigcode/tiny_starcoder_py", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T18_41_27.030233", "path": ["**/details_harness|drop|3_2023-09-17T18-41-27.030233.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T18-41-27.030233.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T18_41_27.030233", "path": ["**/details_harness|gsm8k|5_2023-09-17T18-41-27.030233.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T18-41-27.030233.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:53:24.895112.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:53:24.895112.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:53:24.895112.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:53:24.895112.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:53:24.895112.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T18_41_27.030233", "path": ["**/details_harness|winogrande|5_2023-09-17T18-41-27.030233.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T18-41-27.030233.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_53_24.895112", "path": ["results_2023-07-19T18:53:24.895112.parquet"]}, {"split": "2023_09_17T18_41_27.030233", "path": ["results_2023-09-17T18-41-27.030233.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T18-41-27.030233.parquet"]}]}]}
|
2023-09-17T17:41:39+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of bigcode/tiny_starcoder_py
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model bigcode/tiny_starcoder_py on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
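A minimal sketch, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming convention (the repository id below is an assumption); `harness_winogrande_5` is one of the configurations listed in this record's metadata:
```python
from datasets import load_dataset

# Hypothetical repository id, following the details_<org>__<model> convention.
data = load_dataset(
    "open-llm-leaderboard/details_bigcode__tiny_starcoder_py",
    "harness_winogrande_5",  # a configuration listed in this record's metadata
    split="train",
)
```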
## Latest results
These are the latest results from run 2023-09-17T18:41:27.030233 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of bigcode/tiny_starcoder_py",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model bigcode/tiny_starcoder_py on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T18:41:27.030233(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of bigcode/tiny_starcoder_py",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model bigcode/tiny_starcoder_py on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T18:41:27.030233(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bigcode/tiny_starcoder_py## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model bigcode/tiny_starcoder_py on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T18:41:27.030233(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
f3b504ef7978e96258af6e9c8a1fada1df0adf6a
|
# Dataset Card for Evaluation run of golaxy/gogpt-7b-bloom
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/golaxy/gogpt-7b-bloom
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [golaxy/gogpt-7b-bloom](https://huggingface.co/golaxy/gogpt-7b-bloom) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_golaxy__gogpt-7b-bloom",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-14T21:01:38.341280](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__gogpt-7b-bloom/blob/main/results_2023-10-14T21-01-38.341280.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2214765100671141,
"em_stderr": 0.004252451287967787,
"f1": 0.25772336409395996,
"f1_stderr": 0.00428261897007673,
"acc": 0.31452249408050514,
"acc_stderr": 0.006788199951115784
},
"harness|drop|3": {
"em": 0.2214765100671141,
"em_stderr": 0.004252451287967787,
"f1": 0.25772336409395996,
"f1_stderr": 0.00428261897007673
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.6290449881610103,
"acc_stderr": 0.013576399902231568
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_golaxy__gogpt-7b-bloom
|
[
"region:us"
] |
2023-08-17T22:54:23+00:00
|
{"pretty_name": "Evaluation run of golaxy/gogpt-7b-bloom", "dataset_summary": "Dataset automatically created during the evaluation run of model [golaxy/gogpt-7b-bloom](https://huggingface.co/golaxy/gogpt-7b-bloom) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_golaxy__gogpt-7b-bloom\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-14T21:01:38.341280](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__gogpt-7b-bloom/blob/main/results_2023-10-14T21-01-38.341280.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2214765100671141,\n \"em_stderr\": 0.004252451287967787,\n \"f1\": 0.25772336409395996,\n \"f1_stderr\": 0.00428261897007673,\n \"acc\": 0.31452249408050514,\n \"acc_stderr\": 0.006788199951115784\n },\n \"harness|drop|3\": {\n \"em\": 0.2214765100671141,\n \"em_stderr\": 0.004252451287967787,\n \"f1\": 0.25772336409395996,\n \"f1_stderr\": 0.00428261897007673\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6290449881610103,\n \"acc_stderr\": 0.013576399902231568\n }\n}\n```", "repo_url": "https://huggingface.co/golaxy/gogpt-7b-bloom", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|arc:challenge|25_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T07_35_20.075381", "path": ["**/details_harness|drop|3_2023-09-17T07-35-20.075381.parquet"]}, {"split": "2023_10_14T21_01_38.341280", "path": ["**/details_harness|drop|3_2023-10-14T21-01-38.341280.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-14T21-01-38.341280.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T07_35_20.075381", "path": ["**/details_harness|gsm8k|5_2023-09-17T07-35-20.075381.parquet"]}, {"split": "2023_10_14T21_01_38.341280", "path": ["**/details_harness|gsm8k|5_2023-10-14T21-01-38.341280.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-14T21-01-38.341280.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hellaswag|10_2023-07-31T10:56:27.356745.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T10:56:27.356745.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T10:56:27.356745.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T10:56:27.356745.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T10:56:27.356745.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T10:56:27.356745.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T10:56:27.356745.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T07_35_20.075381", "path": ["**/details_harness|winogrande|5_2023-09-17T07-35-20.075381.parquet"]}, {"split": "2023_10_14T21_01_38.341280", "path": ["**/details_harness|winogrande|5_2023-10-14T21-01-38.341280.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-14T21-01-38.341280.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T10_56_27.356745", "path": ["results_2023-07-31T10:56:27.356745.parquet"]}, {"split": "2023_09_17T07_35_20.075381", "path": ["results_2023-09-17T07-35-20.075381.parquet"]}, {"split": "2023_10_14T21_01_38.341280", "path": ["results_2023-10-14T21-01-38.341280.parquet"]}, {"split": "latest", "path": ["results_2023-10-14T21-01-38.341280.parquet"]}]}]}
|
2023-10-14T20:01:45+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of golaxy/gogpt-7b-bloom
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model golaxy/gogpt-7b-bloom on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
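For instance (mirroring the snippet given in the full card for this record above):
```python
from datasets import load_dataset

# Repository id and configuration taken from the full card for this record above.
data = load_dataset("open-llm-leaderboard/details_golaxy__gogpt-7b-bloom",
    "harness_winogrande_5",
    split="train")
```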
## Latest results
These are the latest results from run 2023-10-14T21:01:38.341280 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of golaxy/gogpt-7b-bloom",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/gogpt-7b-bloom on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-14T21:01:38.341280(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of golaxy/gogpt-7b-bloom",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/gogpt-7b-bloom on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-14T21:01:38.341280(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of golaxy/gogpt-7b-bloom## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/gogpt-7b-bloom on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-14T21:01:38.341280(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
06934beb45d4a63c8d004e70ab4be7a9feda3da7
|
# Dataset Card for Evaluation run of golaxy/gowizardlm
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/golaxy/gowizardlm
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [golaxy/gowizardlm](https://huggingface.co/golaxy/gowizardlm) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_golaxy__gowizardlm_public",
"harness_winogrande_5",
split="train")
```
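The snippet above loads one of the per-task harness configurations; the aggregated "results" configuration described earlier can be loaded the same way. This is a minimal sketch, assuming the "latest" split name listed in this card's configuration metadata:
```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration of this repo.
# The "latest" split name is taken from the configuration metadata in this card;
# adjust it if the repository layout changes.
results = load_dataset(
    "open-llm-leaderboard/details_golaxy__gowizardlm_public",
    "results",
    split="latest",
)
print(results)
```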
## Latest results
These are the [latest results from run 2023-11-06T19:14:17.905225](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__gowizardlm_public/blob/main/results_2023-11-06T19-14-17.905225.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them under the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.23185822147651006,
"em_stderr": 0.0043218689587152855,
"f1": 0.298545511744967,
"f1_stderr": 0.004326284929017432,
"acc": 0.36777820129932687,
"acc_stderr": 0.009143244752913006
},
"harness|drop|3": {
"em": 0.23185822147651006,
"em_stderr": 0.0043218689587152855,
"f1": 0.298545511744967,
"f1_stderr": 0.004326284929017432
},
"harness|gsm8k|5": {
"acc": 0.039423805913570885,
"acc_stderr": 0.00536028003034243
},
"harness|winogrande|5": {
"acc": 0.6961325966850829,
"acc_stderr": 0.012926209475483582
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_golaxy__gowizardlm
|
[
"region:us"
] |
2023-08-17T22:54:32+00:00
|
{"pretty_name": "Evaluation run of golaxy/gowizardlm", "dataset_summary": "Dataset automatically created during the evaluation run of model [golaxy/gowizardlm](https://huggingface.co/golaxy/gowizardlm) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_golaxy__gowizardlm_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-06T19:14:17.905225](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__gowizardlm_public/blob/main/results_2023-11-06T19-14-17.905225.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.23185822147651006,\n \"em_stderr\": 0.0043218689587152855,\n \"f1\": 0.298545511744967,\n \"f1_stderr\": 0.004326284929017432,\n \"acc\": 0.36777820129932687,\n \"acc_stderr\": 0.009143244752913006\n },\n \"harness|drop|3\": {\n \"em\": 0.23185822147651006,\n \"em_stderr\": 0.0043218689587152855,\n \"f1\": 0.298545511744967,\n \"f1_stderr\": 0.004326284929017432\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.039423805913570885,\n \"acc_stderr\": 0.00536028003034243\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6961325966850829,\n \"acc_stderr\": 0.012926209475483582\n }\n}\n```", "repo_url": "https://huggingface.co/golaxy/gowizardlm", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_05T03_40_43.055256", "path": ["**/details_harness|drop|3_2023-11-05T03-40-43.055256.parquet"]}, {"split": "2023_11_06T19_14_17.905225", "path": ["**/details_harness|drop|3_2023-11-06T19-14-17.905225.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-06T19-14-17.905225.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_05T03_40_43.055256", "path": ["**/details_harness|gsm8k|5_2023-11-05T03-40-43.055256.parquet"]}, {"split": "2023_11_06T19_14_17.905225", "path": ["**/details_harness|gsm8k|5_2023-11-06T19-14-17.905225.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-06T19-14-17.905225.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_05T03_40_43.055256", "path": ["**/details_harness|winogrande|5_2023-11-05T03-40-43.055256.parquet"]}, {"split": "2023_11_06T19_14_17.905225", "path": ["**/details_harness|winogrande|5_2023-11-06T19-14-17.905225.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-06T19-14-17.905225.parquet"]}]}, {"config_name": "results", "data_files": 
[{"split": "2023_11_05T03_40_43.055256", "path": ["results_2023-11-05T03-40-43.055256.parquet"]}, {"split": "2023_11_06T19_14_17.905225", "path": ["results_2023-11-06T19-14-17.905225.parquet"]}, {"split": "latest", "path": ["results_2023-11-06T19-14-17.905225.parquet"]}]}]}
|
2023-12-01T14:17:27+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of golaxy/gowizardlm
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model golaxy/gowizardlm on the Open LLM Leaderboard.
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-11-06T19:14:17.905225 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them under the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of golaxy/gowizardlm",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/gowizardlm on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-06T19:14:17.905225(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of golaxy/gowizardlm",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/gowizardlm on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-06T19:14:17.905225(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
16,
31,
165,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of golaxy/gowizardlm## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/gowizardlm on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-06T19:14:17.905225(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
9cc627710b1aa09be9a933f71a6e0920f0375e13
|
# Dataset Card for Evaluation run of golaxy/gogpt-3b-bloom
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/golaxy/gogpt-3b-bloom
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [golaxy/gogpt-3b-bloom](https://huggingface.co/golaxy/gogpt-3b-bloom) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_golaxy__gogpt-3b-bloom",
"harness_winogrande_5",
split="train")
```
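With 64 configurations in this repository, it can help to list them programmatically before choosing one. The sketch below relies on the `get_dataset_config_names` helper from the `datasets` library; the exact configuration names follow the layout described in this card:
```python
from datasets import get_dataset_config_names

# Minimal sketch: enumerate the evaluation configurations exposed by this repo
# (per-task harness configs plus the aggregated "results" config).
configs = get_dataset_config_names("open-llm-leaderboard/details_golaxy__gogpt-3b-bloom")
print(len(configs))
for name in configs:
    print(name)
```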
## Latest results
These are the [latest results from run 2023-09-17T23:51:32.147275](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__gogpt-3b-bloom/blob/main/results_2023-09-17T23-51-32.147275.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them under the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.055893456375838924,
"em_stderr": 0.0023525054515005604,
"f1": 0.10482697147651004,
"f1_stderr": 0.002612068875793489,
"acc": 0.2726602811318757,
"acc_stderr": 0.007535116479736793
},
"harness|drop|3": {
"em": 0.055893456375838924,
"em_stderr": 0.0023525054515005604,
"f1": 0.10482697147651004,
"f1_stderr": 0.002612068875793489
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.0010717793485492658
},
"harness|winogrande|5": {
"acc": 0.5438042620363063,
"acc_stderr": 0.01399845361092432
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_golaxy__gogpt-3b-bloom
|
[
"region:us"
] |
2023-08-17T22:54:41+00:00
|
{"pretty_name": "Evaluation run of golaxy/gogpt-3b-bloom", "dataset_summary": "Dataset automatically created during the evaluation run of model [golaxy/gogpt-3b-bloom](https://huggingface.co/golaxy/gogpt-3b-bloom) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_golaxy__gogpt-3b-bloom\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T23:51:32.147275](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__gogpt-3b-bloom/blob/main/results_2023-09-17T23-51-32.147275.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.055893456375838924,\n \"em_stderr\": 0.0023525054515005604,\n \"f1\": 0.10482697147651004,\n \"f1_stderr\": 0.002612068875793489,\n \"acc\": 0.2726602811318757,\n \"acc_stderr\": 0.007535116479736793\n },\n \"harness|drop|3\": {\n \"em\": 0.055893456375838924,\n \"em_stderr\": 0.0023525054515005604,\n \"f1\": 0.10482697147651004,\n \"f1_stderr\": 0.002612068875793489\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \"acc_stderr\": 0.0010717793485492658\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5438042620363063,\n \"acc_stderr\": 0.01399845361092432\n }\n}\n```", "repo_url": "https://huggingface.co/golaxy/gogpt-3b-bloom", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|arc:challenge|25_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T23_51_32.147275", "path": ["**/details_harness|drop|3_2023-09-17T23-51-32.147275.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T23-51-32.147275.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T23_51_32.147275", "path": ["**/details_harness|gsm8k|5_2023-09-17T23-51-32.147275.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T23-51-32.147275.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hellaswag|10_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T10:49:20.036877.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T10:49:20.036877.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T10:49:20.036877.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T10:49:20.036877.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T10:49:20.036877.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T23_51_32.147275", "path": ["**/details_harness|winogrande|5_2023-09-17T23-51-32.147275.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T23-51-32.147275.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T10_49_20.036877", "path": ["results_2023-07-31T10:49:20.036877.parquet"]}, {"split": "2023_09_17T23_51_32.147275", "path": ["results_2023-09-17T23-51-32.147275.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T23-51-32.147275.parquet"]}]}]}
|
2023-09-17T22:51:43+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of golaxy/gogpt-3b-bloom
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model golaxy/gogpt-3b-bloom on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-09-17T23:51:32.147275 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them under the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of golaxy/gogpt-3b-bloom",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/gogpt-3b-bloom on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T23:51:32.147275(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of golaxy/gogpt-3b-bloom",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/gogpt-3b-bloom on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T23:51:32.147275(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of golaxy/gogpt-3b-bloom## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/gogpt-3b-bloom on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T23:51:32.147275(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
74f7161ebda5b9c07fd4f0412518c0607fabc4c1
|
# Dataset Card for Evaluation run of golaxy/gogpt-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/golaxy/gogpt-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [golaxy/gogpt-7b](https://huggingface.co/golaxy/gogpt-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_golaxy__gogpt-7b",
"harness_winogrande_5",
split="train")
```
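The per-task configurations contain one row per evaluated example. To look at the aggregated scores instead, a minimal sketch (using the "results" configuration and the "latest" split listed in this repository's configs) could be:
```python
from datasets import load_dataset

# Load the aggregated metrics ("results" configuration); the "latest" split
# points to the most recent evaluation run of this model.
results = load_dataset("open-llm-leaderboard/details_golaxy__gogpt-7b",
                       "results",
                       split="latest")
print(results[0])  # inspect the aggregated scores of the latest run
```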
## Latest results
These are the [latest results from run 2023-10-16T19:51:01.588923](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__gogpt-7b/blob/main/results_2023-10-16T19-51-01.588923.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.27936241610738255,
"em_stderr": 0.0045949638030960225,
"f1": 0.3275786493288599,
"f1_stderr": 0.004570156166276547,
"acc": 0.358332440746709,
"acc_stderr": 0.008331639351329504
},
"harness|drop|3": {
"em": 0.27936241610738255,
"em_stderr": 0.0045949638030960225,
"f1": 0.3275786493288599,
"f1_stderr": 0.004570156166276547
},
"harness|gsm8k|5": {
"acc": 0.018953752843062926,
"acc_stderr": 0.0037560783410314704
},
"harness|winogrande|5": {
"acc": 0.6977111286503551,
"acc_stderr": 0.012907200361627538
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_golaxy__gogpt-7b
|
[
"region:us"
] |
2023-08-17T22:54:50+00:00
|
{"pretty_name": "Evaluation run of golaxy/gogpt-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [golaxy/gogpt-7b](https://huggingface.co/golaxy/gogpt-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_golaxy__gogpt-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T19:51:01.588923](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__gogpt-7b/blob/main/results_2023-10-16T19-51-01.588923.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.27936241610738255,\n \"em_stderr\": 0.0045949638030960225,\n \"f1\": 0.3275786493288599,\n \"f1_stderr\": 0.004570156166276547,\n \"acc\": 0.358332440746709,\n \"acc_stderr\": 0.008331639351329504\n },\n \"harness|drop|3\": {\n \"em\": 0.27936241610738255,\n \"em_stderr\": 0.0045949638030960225,\n \"f1\": 0.3275786493288599,\n \"f1_stderr\": 0.004570156166276547\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.018953752843062926,\n \"acc_stderr\": 0.0037560783410314704\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6977111286503551,\n \"acc_stderr\": 0.012907200361627538\n }\n}\n```", "repo_url": "https://huggingface.co/golaxy/gogpt-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|arc:challenge|25_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T19_51_01.588923", "path": ["**/details_harness|drop|3_2023-10-16T19-51-01.588923.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T19-51-01.588923.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T19_51_01.588923", "path": ["**/details_harness|gsm8k|5_2023-10-16T19-51-01.588923.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T19-51-01.588923.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hellaswag|10_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:32:55.056664.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:32:55.056664.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T11:32:55.056664.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:32:55.056664.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T11:32:55.056664.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T11:32:55.056664.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T19_51_01.588923", "path": ["**/details_harness|winogrande|5_2023-10-16T19-51-01.588923.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T19-51-01.588923.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T11_32_55.056664", "path": ["results_2023-07-24T11:32:55.056664.parquet"]}, {"split": "2023_10_16T19_51_01.588923", "path": ["results_2023-10-16T19-51-01.588923.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T19-51-01.588923.parquet"]}]}]}
|
2023-10-16T18:51:14+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of golaxy/gogpt-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model golaxy/gogpt-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-16T19:51:01.588923 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of golaxy/gogpt-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/gogpt-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T19:51:01.588923(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of golaxy/gogpt-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/gogpt-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T19:51:01.588923(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
165,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of golaxy/gogpt-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/gogpt-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T19:51:01.588923(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
ff6a7d53a47aa27aa7b72143901fa7c811430d22
|
# Dataset Card for Evaluation run of golaxy/gogpt2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/golaxy/gogpt2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [golaxy/gogpt2-7b](https://huggingface.co/golaxy/gogpt2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_golaxy__gogpt2-7b",
"harness_winogrande_5",
split="train")
```
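Each run is also exposed under its timestamped split name, so the details of one specific evaluation can be loaded directly; a minimal sketch (the split name below is taken from this repository's configuration list, and the call otherwise mirrors the example above):
```python
from datasets import load_dataset

# Load the details of one specific run via its timestamped split name
# (as listed in this repository's configs) instead of "train"/"latest".
drop_details = load_dataset("open-llm-leaderboard/details_golaxy__gogpt2-7b",
                            "harness_drop_3",
                            split="2023_10_15T07_23_11.777709")
print(len(drop_details), "evaluated examples in this run")
```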
## Latest results
These are the [latest results from run 2023-10-15T07:23:11.777709](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__gogpt2-7b/blob/main/results_2023-10-15T07-23-11.777709.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.24223993288590603,
"em_stderr": 0.004387613635437493,
"f1": 0.31276426174496713,
"f1_stderr": 0.004371100902815104,
"acc": 0.3547029541525623,
"acc_stderr": 0.008571566367248823
},
"harness|drop|3": {
"em": 0.24223993288590603,
"em_stderr": 0.004387613635437493,
"f1": 0.31276426174496713,
"f1_stderr": 0.004371100902815104
},
"harness|gsm8k|5": {
"acc": 0.022744503411675512,
"acc_stderr": 0.004106620637749675
},
"harness|winogrande|5": {
"acc": 0.6866614048934491,
"acc_stderr": 0.01303651209674797
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_golaxy__gogpt2-7b
|
[
"region:us"
] |
2023-08-17T22:54:58+00:00
|
{"pretty_name": "Evaluation run of golaxy/gogpt2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [golaxy/gogpt2-7b](https://huggingface.co/golaxy/gogpt2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_golaxy__gogpt2-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T07:23:11.777709](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__gogpt2-7b/blob/main/results_2023-10-15T07-23-11.777709.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24223993288590603,\n \"em_stderr\": 0.004387613635437493,\n \"f1\": 0.31276426174496713,\n \"f1_stderr\": 0.004371100902815104,\n \"acc\": 0.3547029541525623,\n \"acc_stderr\": 0.008571566367248823\n },\n \"harness|drop|3\": {\n \"em\": 0.24223993288590603,\n \"em_stderr\": 0.004387613635437493,\n \"f1\": 0.31276426174496713,\n \"f1_stderr\": 0.004371100902815104\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.022744503411675512,\n \"acc_stderr\": 0.004106620637749675\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6866614048934491,\n \"acc_stderr\": 0.01303651209674797\n }\n}\n```", "repo_url": "https://huggingface.co/golaxy/gogpt2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|arc:challenge|25_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T07_23_11.777709", "path": ["**/details_harness|drop|3_2023-10-15T07-23-11.777709.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T07-23-11.777709.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T07_23_11.777709", "path": ["**/details_harness|gsm8k|5_2023-10-15T07-23-11.777709.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T07-23-11.777709.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hellaswag|10_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T19:03:01.849561.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T19:03:01.849561.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-26T19:03:01.849561.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T19:03:01.849561.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T19:03:01.849561.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T19:03:01.849561.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T07_23_11.777709", "path": ["**/details_harness|winogrande|5_2023-10-15T07-23-11.777709.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T07-23-11.777709.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_26T19_03_01.849561", "path": ["results_2023-07-26T19:03:01.849561.parquet"]}, {"split": "2023_10_15T07_23_11.777709", "path": ["results_2023-10-15T07-23-11.777709.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T07-23-11.777709.parquet"]}]}]}
|
2023-10-15T06:23:26+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of golaxy/gogpt2-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model golaxy/gogpt2-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
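```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_golaxy__gogpt2-7b",
	"harness_winogrande_5",
	split="train")
```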
## Latest results
These are the latest results from run 2023-10-15T07:23:11.777709 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
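```python
{
    "all": {
        "em": 0.24223993288590603,
        "em_stderr": 0.004387613635437493,
        "f1": 0.31276426174496713,
        "f1_stderr": 0.004371100902815104,
        "acc": 0.3547029541525623,
        "acc_stderr": 0.008571566367248823
    },
    "harness|drop|3": {
        "em": 0.24223993288590603,
        "em_stderr": 0.004387613635437493,
        "f1": 0.31276426174496713,
        "f1_stderr": 0.004371100902815104
    },
    "harness|gsm8k|5": {
        "acc": 0.022744503411675512,
        "acc_stderr": 0.004106620637749675
    },
    "harness|winogrande|5": {
        "acc": 0.6866614048934491,
        "acc_stderr": 0.01303651209674797
    }
}
```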
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of golaxy/gogpt2-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/gogpt2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T07:23:11.777709(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of golaxy/gogpt2-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/gogpt2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T07:23:11.777709(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
18,
31,
166,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of golaxy/gogpt2-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/gogpt2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T07:23:11.777709(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b6ce1aa4bd138600d9becfff051813066db415b3
|
# Dataset Card for Evaluation run of golaxy/goims
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/golaxy/goims
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [golaxy/goims](https://huggingface.co/golaxy/goims) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_golaxy__goims",
"harness_winogrande_5",
split="train")
```
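The same call works for any other configuration of this dataset, for example a per-task detail config or the aggregated "results" configuration described above. A minimal sketch — the config and split names below are taken from this repository's configuration list and are shown only as an illustration; adjust them to the run you are interested in:

```python
from datasets import load_dataset

# Per-task details, e.g. the 5-shot GSM8K run of the latest evaluation.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_golaxy__goims",
    "harness_gsm8k_5",
    split="latest",
)

# Aggregated metrics of every run are stored in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_golaxy__goims",
    "results",
    split="latest",
)
```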
## Latest results
These are the [latest results from run 2023-10-17T02:18:23.733040](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__goims/blob/main/results_2023-10-17T02-18-23.733040.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177333,
"f1": 0.061753355704698,
"f1_stderr": 0.0014402452492549395,
"acc": 0.379924161053344,
"acc_stderr": 0.009802745022083587
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177333,
"f1": 0.061753355704698,
"f1_stderr": 0.0014402452492549395
},
"harness|gsm8k|5": {
"acc": 0.06292645943896892,
"acc_stderr": 0.006688762581532711
},
"harness|winogrande|5": {
"acc": 0.696921862667719,
"acc_stderr": 0.012916727462634463
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_golaxy__goims
|
[
"region:us"
] |
2023-08-17T22:55:07+00:00
|
{"pretty_name": "Evaluation run of golaxy/goims", "dataset_summary": "Dataset automatically created during the evaluation run of model [golaxy/goims](https://huggingface.co/golaxy/goims) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_golaxy__goims\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T02:18:23.733040](https://huggingface.co/datasets/open-llm-leaderboard/details_golaxy__goims/blob/main/results_2023-10-17T02-18-23.733040.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177333,\n \"f1\": 0.061753355704698,\n \"f1_stderr\": 0.0014402452492549395,\n \"acc\": 0.379924161053344,\n \"acc_stderr\": 0.009802745022083587\n },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177333,\n \"f1\": 0.061753355704698,\n \"f1_stderr\": 0.0014402452492549395\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06292645943896892,\n \"acc_stderr\": 0.006688762581532711\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.696921862667719,\n \"acc_stderr\": 0.012916727462634463\n }\n}\n```", "repo_url": "https://huggingface.co/golaxy/goims", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|arc:challenge|25_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T02_18_23.733040", "path": ["**/details_harness|drop|3_2023-10-17T02-18-23.733040.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T02-18-23.733040.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T02_18_23.733040", "path": ["**/details_harness|gsm8k|5_2023-10-17T02-18-23.733040.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T02-18-23.733040.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hellaswag|10_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:57:12.922580.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:57:12.922580.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T10:57:12.922580.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:57:12.922580.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T10:57:12.922580.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T10:57:12.922580.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T02_18_23.733040", "path": ["**/details_harness|winogrande|5_2023-10-17T02-18-23.733040.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T02-18-23.733040.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T10_57_12.922580", "path": ["results_2023-08-09T10:57:12.922580.parquet"]}, {"split": "2023_10_17T02_18_23.733040", "path": ["results_2023-10-17T02-18-23.733040.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T02-18-23.733040.parquet"]}]}]}
|
2023-10-17T01:18:36+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of golaxy/goims
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model golaxy/goims on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
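A minimal sketch of such a load call, assuming the repository id follows the usual open-llm-leaderboard naming convention for this model (the id below is an assumption, not taken from this card) and using the `harness_winogrande_5` configuration listed in the metadata above:
```python
from datasets import load_dataset

# Hypothetical repository id, assuming the standard
# "open-llm-leaderboard/details_<org>__<model>" naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_golaxy__goims",
    "harness_winogrande_5",   # one of the 64 task configurations
    split="train",            # "train" always points to the latest results
)
```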
## Latest results
These are the latest results from run 2023-10-17T02:18:23.733040 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of golaxy/goims",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/goims on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T02:18:23.733040(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of golaxy/goims",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/goims on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T02:18:23.733040(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
15,
31,
163,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of golaxy/goims## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model golaxy/goims on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T02:18:23.733040(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
fb4f6923455a53f07afc290805b41af73dd198a9
|
# Dataset Card for Evaluation run of MayaPH/GodziLLa-30B-plus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MayaPH/GodziLLa-30B-plus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [MayaPH/GodziLLa-30B-plus](https://huggingface.co/MayaPH/GodziLLa-30B-plus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MayaPH__GodziLLa-30B-plus",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-02T15:59:28.566104](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__GodziLLa-30B-plus/blob/main/results_2023-08-02T15%3A59%3A28.566104.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24241616689179757,
"acc_stderr": 0.031123746423736127,
"acc_norm": 0.2433585961477068,
"acc_norm_stderr": 0.0311383188083218,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862661,
"mc2": 0.4855571913341874,
"mc2_stderr": 0.01696461298096417
},
"harness|arc:challenge|25": {
"acc": 0.23720136518771331,
"acc_stderr": 0.012430399829260844,
"acc_norm": 0.29180887372013653,
"acc_norm_stderr": 0.013284525292403511
},
"harness|hellaswag|10": {
"acc": 0.25253933479386576,
"acc_stderr": 0.0043358096144803055,
"acc_norm": 0.2535351523600876,
"acc_norm_stderr": 0.004341454841892327
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.033176727875331574,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.033176727875331574
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2037735849056604,
"acc_stderr": 0.024790784501775395,
"acc_norm": 0.2037735849056604,
"acc_norm_stderr": 0.024790784501775395
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.02937917046412482,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.02937917046412482
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489361,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489361
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2,
"acc_stderr": 0.033333333333333305,
"acc_norm": 0.2,
"acc_norm_stderr": 0.033333333333333305
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302054,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302054
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22580645161290322,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.22580645161290322,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22167487684729065,
"acc_stderr": 0.02922557589248962,
"acc_norm": 0.22167487684729065,
"acc_norm_stderr": 0.02922557589248962
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.0340150671524904,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.0340150671524904
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1919191919191919,
"acc_stderr": 0.02805779167298901,
"acc_norm": 0.1919191919191919,
"acc_norm_stderr": 0.02805779167298901
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.03221024508041156,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.03221024508041156
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.02047323317355198,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.02047323317355198
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.026335739404055803,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.026335739404055803
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.02720537153827946,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.02720537153827946
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804725,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804725
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24036697247706423,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.24036697247706423,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.22685185185185186,
"acc_stderr": 0.028561650102422273,
"acc_norm": 0.22685185185185186,
"acc_norm_stderr": 0.028561650102422273
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29901960784313725,
"acc_stderr": 0.03213325717373616,
"acc_norm": 0.29901960784313725,
"acc_norm_stderr": 0.03213325717373616
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.029571601065753378,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.029571601065753378
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.27802690582959644,
"acc_stderr": 0.03006958487449403,
"acc_norm": 0.27802690582959644,
"acc_norm_stderr": 0.03006958487449403
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.30578512396694213,
"acc_stderr": 0.042059539338841226,
"acc_norm": 0.30578512396694213,
"acc_norm_stderr": 0.042059539338841226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943353,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943353
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952686,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952686
},
"harness|hendrycksTest-management|5": {
"acc": 0.14563106796116504,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.14563106796116504,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2503192848020434,
"acc_stderr": 0.01549108895149459,
"acc_norm": 0.2503192848020434,
"acc_norm_stderr": 0.01549108895149459
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.022894082489925992,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.022894082489925992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.014508979453553969,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.014508979453553969
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.20915032679738563,
"acc_stderr": 0.023287685312334806,
"acc_norm": 0.20915032679738563,
"acc_norm_stderr": 0.023287685312334806
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2282958199356913,
"acc_stderr": 0.0238393033113982,
"acc_norm": 0.2282958199356913,
"acc_norm_stderr": 0.0238393033113982
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.023016705640262196,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.023016705640262196
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2685788787483703,
"acc_stderr": 0.011320056629121727,
"acc_norm": 0.2685788787483703,
"acc_norm_stderr": 0.011320056629121727
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.025767252010855956,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.025767252010855956
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910507,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.02540930195322568,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.02540930195322568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22388059701492538,
"acc_stderr": 0.029475250236017204,
"acc_norm": 0.22388059701492538,
"acc_norm_stderr": 0.029475250236017204
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3493975903614458,
"acc_stderr": 0.03711725190740749,
"acc_norm": 0.3493975903614458,
"acc_norm_stderr": 0.03711725190740749
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.03301405946987251,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.03301405946987251
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862661,
"mc2": 0.4855571913341874,
"mc2_stderr": 0.01696461298096417
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_MayaPH__GodziLLa-30B-plus
|
[
"region:us"
] |
2023-08-17T22:55:16+00:00
|
{"pretty_name": "Evaluation run of MayaPH/GodziLLa-30B-plus", "dataset_summary": "Dataset automatically created during the evaluation run of model [MayaPH/GodziLLa-30B-plus](https://huggingface.co/MayaPH/GodziLLa-30B-plus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MayaPH__GodziLLa-30B-plus\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-02T15:59:28.566104](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__GodziLLa-30B-plus/blob/main/results_2023-08-02T15%3A59%3A28.566104.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24241616689179757,\n \"acc_stderr\": 0.031123746423736127,\n \"acc_norm\": 0.2433585961477068,\n \"acc_norm_stderr\": 0.0311383188083218,\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862661,\n \"mc2\": 0.4855571913341874,\n \"mc2_stderr\": 0.01696461298096417\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.23720136518771331,\n \"acc_stderr\": 0.012430399829260844,\n \"acc_norm\": 0.29180887372013653,\n \"acc_norm_stderr\": 0.013284525292403511\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25253933479386576,\n \"acc_stderr\": 0.0043358096144803055,\n \"acc_norm\": 0.2535351523600876,\n \"acc_norm_stderr\": 0.004341454841892327\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.033176727875331574,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.033176727875331574\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2037735849056604,\n \"acc_stderr\": 0.024790784501775395,\n \"acc_norm\": 0.2037735849056604,\n \"acc_norm_stderr\": 0.024790784501775395\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.24305555555555555,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 
0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.02937917046412482,\n \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.02937917046412482\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.04142439719489361,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.04142439719489361\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.033333333333333305,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.033333333333333305\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n \"acc_stderr\": 0.03512207412302054,\n \"acc_norm\": 0.19047619047619047,\n \"acc_norm_stderr\": 0.03512207412302054\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22580645161290322,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.22580645161290322,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.22167487684729065,\n \"acc_stderr\": 0.02922557589248962,\n \"acc_norm\": 0.22167487684729065,\n \"acc_norm_stderr\": 0.02922557589248962\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.0340150671524904,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.0340150671524904\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.1919191919191919,\n \"acc_stderr\": 0.02805779167298901,\n \"acc_norm\": 0.1919191919191919,\n \"acc_norm_stderr\": 0.02805779167298901\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.03221024508041156,\n \"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.03221024508041156\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20512820512820512,\n \"acc_stderr\": 
0.02047323317355198,\n \"acc_norm\": 0.20512820512820512,\n \"acc_norm_stderr\": 0.02047323317355198\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.02720537153827946,\n \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.02720537153827946\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804725,\n \"acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804725\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.22685185185185186,\n \"acc_stderr\": 0.028561650102422273,\n \"acc_norm\": 0.22685185185185186,\n \"acc_norm_stderr\": 0.028561650102422273\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.29901960784313725,\n \"acc_stderr\": 0.03213325717373616,\n \"acc_norm\": 0.29901960784313725,\n \"acc_norm_stderr\": 0.03213325717373616\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2911392405063291,\n \"acc_stderr\": 0.029571601065753378,\n \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.029571601065753378\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.27802690582959644,\n \"acc_stderr\": 0.03006958487449403,\n \"acc_norm\": 0.27802690582959644,\n \"acc_norm_stderr\": 0.03006958487449403\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.03547771004159464,\n \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.03547771004159464\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.30578512396694213,\n \"acc_stderr\": 0.042059539338841226,\n \"acc_norm\": 0.30578512396694213,\n \"acc_norm_stderr\": 0.042059539338841226\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943353,\n \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943353\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n \"acc_stderr\": 0.04059867246952686,\n \"acc_norm\": 0.24107142857142858,\n \"acc_norm_stderr\": 0.04059867246952686\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.14563106796116504,\n \"acc_stderr\": 0.034926064766237906,\n \"acc_norm\": 0.14563106796116504,\n \"acc_norm_stderr\": 0.034926064766237906\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2503192848020434,\n \"acc_stderr\": 0.01549108895149459,\n \"acc_norm\": 
0.2503192848020434,\n \"acc_norm_stderr\": 0.01549108895149459\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.022894082489925992,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.022894082489925992\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n \"acc_stderr\": 0.014508979453553969,\n \"acc_norm\": 0.25139664804469275,\n \"acc_norm_stderr\": 0.014508979453553969\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.20915032679738563,\n \"acc_stderr\": 0.023287685312334806,\n \"acc_norm\": 0.20915032679738563,\n \"acc_norm_stderr\": 0.023287685312334806\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2282958199356913,\n \"acc_stderr\": 0.0238393033113982,\n \"acc_norm\": 0.2282958199356913,\n \"acc_norm_stderr\": 0.0238393033113982\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.023016705640262196,\n \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.023016705640262196\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2685788787483703,\n \"acc_stderr\": 0.011320056629121727,\n \"acc_norm\": 0.2685788787483703,\n \"acc_norm_stderr\": 0.011320056629121727\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.025767252010855956,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.025767252010855956\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n \"acc_stderr\": 0.04461272175910507,\n \"acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.04461272175910507\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.02540930195322568,\n \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.02540930195322568\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22388059701492538,\n \"acc_stderr\": 0.029475250236017204,\n \"acc_norm\": 0.22388059701492538,\n \"acc_norm_stderr\": 0.029475250236017204\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n \"acc_stderr\": 0.03711725190740749,\n \"acc_norm\": 0.3493975903614458,\n \"acc_norm_stderr\": 0.03711725190740749\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.03301405946987251,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.03301405946987251\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862661,\n \"mc2\": 0.4855571913341874,\n \"mc2_stderr\": 0.01696461298096417\n }\n}\n```", "repo_url": "https://huggingface.co/MayaPH/GodziLLa-30B-plus", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", 
"data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|arc:challenge|25_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|arc:challenge|25_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hellaswag|10_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hellaswag|10_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-28T12:09:48.036825.parquet", 
"**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-28T12:09:48.036825.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:59:28.566104.parquet", 
"**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:59:28.566104.parquet", 
"**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:59:28.566104.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T15:59:28.566104.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": 
[{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", 
"data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T15:59:28.566104.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_28T12_09_48.036825", "path": ["results_2023-07-28T12:09:48.036825.parquet"]}, {"split": "2023_08_02T15_59_28.566104", "path": ["results_2023-08-02T15:59:28.566104.parquet"]}, {"split": "latest", "path": ["results_2023-08-02T15:59:28.566104.parquet"]}]}]}
|
2023-08-27T11:25:17+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of MayaPH/GodziLLa-30B-plus
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model MayaPH/GodziLLa-30B-plus on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
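For instance, a minimal sketch along the lines of the other cards in this collection; the repo id and config name below are assumptions inferred from the leaderboard's naming pattern and the configs listed in the metadata above, not copied from the original snippet:

```python
from datasets import load_dataset

# Assumed repo id, following the "details_<org>__<model>" pattern used by the leaderboard.
# "harness_truthfulqa_mc_0" is one of the configs listed in this card's metadata;
# the "latest" split points at the most recent evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_MayaPH__GodziLLa-30B-plus",
    "harness_truthfulqa_mc_0",
    split="latest",
)
```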
## Latest results
These are the latest results from run 2023-08-02T15:59:28.566104 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of MayaPH/GodziLLa-30B-plus",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/GodziLLa-30B-plus on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-02T15:59:28.566104 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of MayaPH/GodziLLa-30B-plus",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/GodziLLa-30B-plus on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-02T15:59:28.566104 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of MayaPH/GodziLLa-30B-plus## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/GodziLLa-30B-plus on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-02T15:59:28.566104 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
6aa86db9e159a538e6356de4b5abf1c41d87ef13
|
# Dataset Card for Evaluation run of MayaPH/GodziLLa2-70B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MayaPH/GodziLLa2-70B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [MayaPH/GodziLLa2-70B](https://huggingface.co/MayaPH/GodziLLa2-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MayaPH__GodziLLa2-70B_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-08T19:39:50.850432](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__GodziLLa2-70B_public/blob/main/results_2023-11-08T19-39-50.850432.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.40918624161073824,
"em_stderr": 0.0050353012998842275,
"f1": 0.523052642617452,
"f1_stderr": 0.004562583016028929,
"acc": 0.6320159552601676,
"acc_stderr": 0.01207770454600458
},
"harness|drop|3": {
"em": 0.40918624161073824,
"em_stderr": 0.0050353012998842275,
"f1": 0.523052642617452,
"f1_stderr": 0.004562583016028929
},
"harness|gsm8k|5": {
"acc": 0.43214556482183475,
"acc_stderr": 0.013645072137842443
},
"harness|winogrande|5": {
"acc": 0.8318863456985004,
"acc_stderr": 0.010510336954166718
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_MayaPH__GodziLLa2-70B
|
[
"region:us"
] |
2023-08-17T22:55:34+00:00
|
{"pretty_name": "Evaluation run of MayaPH/GodziLLa2-70B", "dataset_summary": "Dataset automatically created during the evaluation run of model [MayaPH/GodziLLa2-70B](https://huggingface.co/MayaPH/GodziLLa2-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MayaPH__GodziLLa2-70B_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-08T19:39:50.850432](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__GodziLLa2-70B_public/blob/main/results_2023-11-08T19-39-50.850432.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.40918624161073824,\n \"em_stderr\": 0.0050353012998842275,\n \"f1\": 0.523052642617452,\n \"f1_stderr\": 0.004562583016028929,\n \"acc\": 0.6320159552601676,\n \"acc_stderr\": 0.01207770454600458\n },\n \"harness|drop|3\": {\n \"em\": 0.40918624161073824,\n \"em_stderr\": 0.0050353012998842275,\n \"f1\": 0.523052642617452,\n \"f1_stderr\": 0.004562583016028929\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.43214556482183475,\n \"acc_stderr\": 0.013645072137842443\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.010510336954166718\n }\n}\n```", "repo_url": "https://huggingface.co/MayaPH/GodziLLa2-70B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_08T19_39_50.850432", "path": ["**/details_harness|drop|3_2023-11-08T19-39-50.850432.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-08T19-39-50.850432.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_08T19_39_50.850432", "path": ["**/details_harness|gsm8k|5_2023-11-08T19-39-50.850432.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-08T19-39-50.850432.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_08T19_39_50.850432", "path": ["**/details_harness|winogrande|5_2023-11-08T19-39-50.850432.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-08T19-39-50.850432.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_08T19_39_50.850432", "path": ["results_2023-11-08T19-39-50.850432.parquet"]}, {"split": "latest", "path": ["results_2023-11-08T19-39-50.850432.parquet"]}]}]}
|
2023-12-01T14:52:03+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of MayaPH/GodziLLa2-70B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model MayaPH/GodziLLa2-70B on the Open LLM Leaderboard.
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
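For instance, the same snippet shown in the full card above:

```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the 3 configs of this dataset;
# the "train" split points to the latest results.
data = load_dataset("open-llm-leaderboard/details_MayaPH__GodziLLa2-70B_public",
	"harness_winogrande_5",
	split="train")
```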
## Latest results
These are the latest results from run 2023-11-08T19:39:50.850432 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of MayaPH/GodziLLa2-70B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/GodziLLa2-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-08T19:39:50.850432(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of MayaPH/GodziLLa2-70B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/GodziLLa2-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-08T19:39:50.850432(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of MayaPH/GodziLLa2-70B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model MayaPH/GodziLLa2-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-08T19:39:50.850432(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |