Dataset viewer schema (column name, value type, and min–max length as reported by the viewer):

| Column | Type | Min length | Max length |
|:----------------|:-------|:-----------|:-----------|
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | list | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | list | 0 | 25 |
| languages | list | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | list | 0 | 352 |
| processed_texts | list | 1 | 353 |
| tokens_length | list | 1 | 353 |
| input_texts | list | 1 | 40 |
28526c9cdaa5948cf395b1df795d505e3422d614
|
# Dataset of ogata_chieri/緒方智絵里/오가타치에리 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of ogata_chieri/緒方智絵里/오가타치에리 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `brown_hair, twintails, brown_eyes, bangs, sidelocks, ribbon, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 668.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ogata_chieri_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 379.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ogata_chieri_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1198 | 828.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ogata_chieri_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 590.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ogata_chieri_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1198 | 1.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ogata_chieri_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ogata_chieri_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
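Each loaded item exposes its image plus the metadata printed above, so simple post-filters can be written directly in Python. A hedged follow-on sketch (the tag name is purely illustrative, and `item.meta['tags']` is assumed to be keyed by tag name, as the loop above suggests):

```python
# A hedged sketch: keep only items carrying a given tag. The tag name below
# is illustrative, and `item.meta['tags']` is assumed to be keyed by tag
# name; a fresh LocalSource is created to start from a new iterator.
matching = [
    item for item in LocalSource(dataset_dir)
    if 'four-leaf_clover' in item.meta['tags']
]
print(f"{len(matching)} images are tagged four-leaf_clover")
```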
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, smile, blush, looking_at_viewer, pink_bikini, collarbone, frilled_bikini, navel, medium_breasts, open_mouth, cleavage, outdoors, blue_sky, cloud, day, hair_ornament |
| 1 | 27 |  |  |  |  |  | 1girl, solo, smile, blush, dress, looking_at_viewer, open_mouth, four-leaf_clover |
| 2 | 5 |  |  |  |  |  | 1girl, blush, dress, looking_at_viewer, solo, detached_sleeves, hair_ornament, hairband, heart_pillow, pillow_hug, open_mouth, smile, thighhighs, apron, bare_shoulders, four-leaf_clover, frills, mary_janes, socks |
| 3 | 16 |  |  |  |  |  | hair_ribbon, 1girl, blush, looking_at_viewer, solo, long_sleeves, red_ribbon, school_uniform, simple_background, red_bowtie, white_background, white_shirt, open_mouth, upper_body, collared_shirt, pink_sweater, plaid_skirt, :d, pleated_skirt |
| 4 | 7 |  |  |  |  |  | 1girl, blush, long_hair, looking_at_viewer, solo, white_headwear, beret, short_over_long_sleeves, white_background, fur_trim, hands_up, pink_bow, pom_pom_(clothes), smile, closed_mouth, pink_ribbon, puffy_short_sleeves, white_dress, boots, brown_footwear, center_frills, frilled_dress, heart, pink_dress, simple_background, sleeves_past_wrists, upper_body |
| 5 | 6 |  |  |  |  |  | 1girl, blue_one-piece_swimsuit, blush, looking_at_viewer, simple_background, white_background, collarbone, covered_navel, long_hair, small_breasts, solo, bare_shoulders, open_mouth, old_school_swimsuit |
| 6 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, sitting, solo, nipples, nude, onsen, smile, medium_breasts, small_breasts, wet, clover_hair_ornament, collarbone, holding_towel, leaf, long_hair, night, open_mouth, steam |
| 7 | 12 |  |  |  |  |  | blush, 1girl, nipples, hetero, small_breasts, 1boy, open_mouth, penis, solo_focus, sex, mosaic_censoring, navel, pussy, completely_nude, looking_at_viewer, on_back, spread_legs, vaginal |
| 8 | 6 |  |  |  |  |  | black_dress, blush, frilled_apron, looking_at_viewer, maid_apron, maid_headdress, white_apron, 1girl, puffy_short_sleeves, bow, collared_dress, enmaided, neck_ribbon, open_mouth, brown_dress, frilled_dress, holding, long_hair, red_ribbon, small_breasts, solo_focus |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | blush | looking_at_viewer | pink_bikini | collarbone | frilled_bikini | navel | medium_breasts | open_mouth | cleavage | outdoors | blue_sky | cloud | day | hair_ornament | dress | four-leaf_clover | detached_sleeves | hairband | heart_pillow | pillow_hug | thighhighs | apron | bare_shoulders | frills | mary_janes | socks | hair_ribbon | long_sleeves | red_ribbon | school_uniform | simple_background | red_bowtie | white_background | white_shirt | upper_body | collared_shirt | pink_sweater | plaid_skirt | :d | pleated_skirt | long_hair | white_headwear | beret | short_over_long_sleeves | fur_trim | hands_up | pink_bow | pom_pom_(clothes) | closed_mouth | pink_ribbon | puffy_short_sleeves | white_dress | boots | brown_footwear | center_frills | frilled_dress | heart | pink_dress | sleeves_past_wrists | blue_one-piece_swimsuit | covered_navel | small_breasts | old_school_swimsuit | sitting | nipples | nude | onsen | wet | clover_hair_ornament | holding_towel | leaf | night | steam | hetero | 1boy | penis | solo_focus | sex | mosaic_censoring | pussy | completely_nude | on_back | spread_legs | vaginal | black_dress | frilled_apron | maid_apron | maid_headdress | white_apron | bow | collared_dress | enmaided | neck_ribbon | brown_dress | holding |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------|:--------------------|:--------------|:-------------|:-----------------|:--------|:-----------------|:-------------|:-----------|:-----------|:-----------|:--------|:------|:----------------|:--------|:-------------------|:-------------------|:-----------|:---------------|:-------------|:-------------|:--------|:-----------------|:---------|:-------------|:--------|:--------------|:---------------|:-------------|:-----------------|:--------------------|:-------------|:-------------------|:--------------|:-------------|:-----------------|:---------------|:--------------|:-----|:----------------|:------------|:-----------------|:--------|:--------------------------|:-----------|:-----------|:-----------|:--------------------|:---------------|:--------------|:----------------------|:--------------|:--------|:-----------------|:----------------|:----------------|:--------|:-------------|:----------------------|:--------------------------|:----------------|:----------------|:----------------------|:----------|:----------|:-------|:--------|:------|:-----------------------|:----------------|:-------|:--------|:--------|:---------|:-------|:--------|:-------------|:------|:-------------------|:--------|:------------------|:----------|:--------------|:----------|:--------------|:----------------|:-------------|:-----------------|:--------------|:------|:-----------------|:-----------|:--------------|:--------------|:----------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 27 |  |  |  |  |  | X | X | X | X | X | | | | | | X | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 16 |  |  |  |  |  | X | X | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | | X | X | | X | | | | X | | | | | | | | | | | | | | | X | | | | | | | | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 12 |  |  |  |  |  | X | | | X | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | X | | | | | X | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/ogata_chieri_idolmastercinderellagirls
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-12T19:14:10+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T13:07:51+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of ogata\_chieri/緒方智絵里/오가타치에리 (THE iDOLM@STER: Cinderella Girls)
========================================================================
This is the dataset of ogata\_chieri/緒方智絵里/오가타치에리 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are 'brown\_hair, twintails, brown\_eyes, bangs, sidelocks, ribbon, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, run the following code:
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
5a8cba3a7c4feb9ce66054e9a4396c0b73e09e51
|
# Dataset Card for "reuters_articles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
ingeniumacademy/reuters_articles
|
[
"region:us"
] |
2023-09-12T19:20:21+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "title", "dtype": "string"}, {"name": "body", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13792576, "num_examples": 17262}, {"name": "validation", "num_bytes": 1870389, "num_examples": 2158}, {"name": "test", "num_bytes": 1379190, "num_examples": 2158}], "download_size": 10073411, "dataset_size": 17042155}}
|
2023-09-12T21:14:36+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "reuters_articles"
More Information needed
|
[
"# Dataset Card for \"reuters_articles\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"reuters_articles\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"reuters_articles\"\n\nMore Information needed"
] |
9a0005c49bf1aab9cd95186b6dba401108df21c6
|
# Dataset Card for "Kosice_training"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
MilanHrab/Kosice_training
|
[
"region:us"
] |
2023-09-12T19:39:38+00:00
|
{"dataset_info": {"features": [{"name": "name_of_record", "dtype": "string"}, {"name": "speech_array", "sequence": "float64"}, {"name": "sampling_rate", "dtype": "int64"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1178840561.6, "num_examples": 4480}], "download_size": 894629427, "dataset_size": 1178840561.6}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-12T19:47:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "Kosice_training"
More Information needed
|
[
"# Dataset Card for \"Kosice_training\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"Kosice_training\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"Kosice_training\"\n\nMore Information needed"
] |
c89115b5ab6787321ef577cd56f2b695359887cc
|
# Dataset Card for "Kosice_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
MilanHrab/Kosice_test
|
[
"region:us"
] |
2023-09-12T19:40:14+00:00
|
{"dataset_info": {"features": [{"name": "name_of_record", "dtype": "string"}, {"name": "speech_array", "sequence": "float64"}, {"name": "sampling_rate", "dtype": "int64"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 294710140.4, "num_examples": 1120}], "download_size": 223895398, "dataset_size": 294710140.4}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-12T19:47:46+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "Kosice_test"
More Information needed
|
[
"# Dataset Card for \"Kosice_test\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"Kosice_test\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"Kosice_test\"\n\nMore Information needed"
] |
77f922f79c71529e2471823a485a7dd67e439216
|
# Dataset of carnet/カルネ (Pokémon)
This is the dataset of carnet/カルネ (Pokémon), containing 199 images and their tags.
The core tags of this character are `short_hair, blue_eyes, black_hair, breasts, eyelashes, eyeshadow, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 199 | 166.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carnet_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 199 | 107.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carnet_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 401 | 204.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carnet_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 199 | 153.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carnet_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 401 | 268.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carnet_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/carnet_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 36 |  |  |  |  |  | 1girl, looking_at_viewer, smile, necklace, white_coat, long_sleeves, open_coat, solo, makeup, white_shirt, brown_hair, collarbone, closed_mouth, white_choker, white_shorts, pokemon_(creature), hand_up |
| 1 | 5 |  |  |  |  |  | bob_cut, brown_hair, flat_chest, necklace, pokemon_(creature), smile, white_skin, 1girl, closed_mouth, green_hair, hair_over_one_eye, happy, long_sleeves, mega_pokemon, open_coat, red_eyes, shorts, standing, strapless_dress, white_choker, white_coat, white_dress, 2girls, collarbone, hand_up, looking_at_viewer, short_jumpsuit, bare_shoulders, cowboy_shot, full_body, grey_eyes, shiny_hair, shirt, signature, white_gloves |
| 2 | 15 |  |  |  |  |  | 1girl, nipples, navel, nude, solo, pussy, smile, blush, female_pubic_hair, makeup, large_breasts, necklace, grey_hair, mature_female, open_mouth, spread_legs |
| 3 | 8 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, open_mouth, sex, penis, solo_focus, blush, medium_breasts, navel, spread_legs, vaginal, choker, makeup, necklace, nude, girl_on_top, looking_at_viewer, open_clothes, straddling, sweat, brown_hair, cum, earrings, pov, pussy_juice, uncensored |
| 4 | 13 |  |  |  |  |  | 1girl, makeup, looking_at_viewer, smile, tiara, earrings, black_gloves, solo, black_dress, closed_mouth, parted_lips, pokemon_(creature), sparkle, brown_hair, gem |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | necklace | white_coat | long_sleeves | open_coat | solo | makeup | white_shirt | brown_hair | collarbone | closed_mouth | white_choker | white_shorts | pokemon_(creature) | hand_up | bob_cut | flat_chest | white_skin | green_hair | hair_over_one_eye | happy | mega_pokemon | red_eyes | shorts | standing | strapless_dress | white_dress | 2girls | short_jumpsuit | bare_shoulders | cowboy_shot | full_body | grey_eyes | shiny_hair | shirt | signature | white_gloves | nipples | navel | nude | pussy | blush | female_pubic_hair | large_breasts | grey_hair | mature_female | open_mouth | spread_legs | 1boy | hetero | sex | penis | solo_focus | medium_breasts | vaginal | choker | girl_on_top | open_clothes | straddling | sweat | cum | earrings | pov | pussy_juice | uncensored | tiara | black_gloves | black_dress | parted_lips | sparkle | gem |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-----------|:-------------|:---------------|:------------|:-------|:---------|:--------------|:-------------|:-------------|:---------------|:---------------|:---------------|:---------------------|:----------|:----------|:-------------|:-------------|:-------------|:--------------------|:--------|:---------------|:-----------|:---------|:-----------|:------------------|:--------------|:---------|:-----------------|:-----------------|:--------------|:------------|:------------|:-------------|:--------|:------------|:---------------|:----------|:--------|:-------|:--------|:--------|:--------------------|:----------------|:------------|:----------------|:-------------|:--------------|:-------|:---------|:------|:--------|:-------------|:-----------------|:----------|:---------|:--------------|:---------------|:-------------|:--------|:------|:-----------|:------|:--------------|:-------------|:--------|:---------------|:--------------|:--------------|:----------|:------|
| 0 | 36 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 4 | 13 |  |  |  |  |  | X | X | X | | | | | X | X | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X |
|
CyberHarem/carnet_pokemon
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-12T19:49:39+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T22:12:00+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of carnet/カルネ (Pokémon)
===============================
This is the dataset of carnet/カルネ (Pokémon), containing 199 images and their tags.
The core tags of this character are 'short\_hair, blue\_eyes, black\_hair, breasts, eyelashes, eyeshadow, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, run the following code:
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
97c04865e415fec9b40967824cce30afbf7985ac
|
# Dataset Card for Evaluation run of chargoddard/llama-2-26b-trenchcoat-stack
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/llama-2-26b-trenchcoat-stack
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/llama-2-26b-trenchcoat-stack](https://huggingface.co/chargoddard/llama-2-26b-trenchcoat-stack) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__llama-2-26b-trenchcoat-stack_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-05T03:20:31.232234](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__llama-2-26b-trenchcoat-stack_public/blob/main/results_2023-11-05T03-20-31.232234.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one under its own configuration, in the "latest" split of each eval):
```python
{
"all": {
"em": 0.028208892617449664,
"em_stderr": 0.0016955832997069967,
"f1": 0.07960255872483231,
"f1_stderr": 0.0020841586471945246,
"acc": 0.3881222949389441,
"acc_stderr": 0.00840931636658079
},
"harness|drop|3": {
"em": 0.028208892617449664,
"em_stderr": 0.0016955832997069967,
"f1": 0.07960255872483231,
"f1_stderr": 0.0020841586471945246
},
"harness|gsm8k|5": {
"acc": 0.02880970432145565,
"acc_stderr": 0.004607484283767473
},
"harness|winogrande|5": {
"acc": 0.7474348855564326,
"acc_stderr": 0.012211148449394105
}
}
```
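The aggregated metrics can be pulled the same way. A minimal sketch, assuming the "results" configuration and "latest" split declared in this repository's metadata:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split points
# at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_chargoddard__llama-2-26b-trenchcoat-stack_public",
    "results",
    split="latest",
)
print(results[0])
```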
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_chargoddard__llama-2-26b-trenchcoat-stack
|
[
"region:us"
] |
2023-09-12T19:50:50+00:00
|
{"pretty_name": "Evaluation run of chargoddard/llama-2-26b-trenchcoat-stack", "dataset_summary": "Dataset automatically created during the evaluation run of model [chargoddard/llama-2-26b-trenchcoat-stack](https://huggingface.co/chargoddard/llama-2-26b-trenchcoat-stack) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__llama-2-26b-trenchcoat-stack_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-05T03:20:31.232234](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__llama-2-26b-trenchcoat-stack_public/blob/main/results_2023-11-05T03-20-31.232234.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.028208892617449664,\n \"em_stderr\": 0.0016955832997069967,\n \"f1\": 0.07960255872483231,\n \"f1_stderr\": 0.0020841586471945246,\n \"acc\": 0.3881222949389441,\n \"acc_stderr\": 0.00840931636658079\n },\n \"harness|drop|3\": {\n \"em\": 0.028208892617449664,\n \"em_stderr\": 0.0016955832997069967,\n \"f1\": 0.07960255872483231,\n \"f1_stderr\": 0.0020841586471945246\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02880970432145565,\n \"acc_stderr\": 0.004607484283767473\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7474348855564326,\n \"acc_stderr\": 0.012211148449394105\n }\n}\n```", "repo_url": "https://huggingface.co/chargoddard/llama-2-26b-trenchcoat-stack", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_05T03_20_31.232234", "path": ["**/details_harness|drop|3_2023-11-05T03-20-31.232234.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-05T03-20-31.232234.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_05T03_20_31.232234", "path": ["**/details_harness|gsm8k|5_2023-11-05T03-20-31.232234.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-05T03-20-31.232234.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_05T03_20_31.232234", "path": ["**/details_harness|winogrande|5_2023-11-05T03-20-31.232234.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-05T03-20-31.232234.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_05T03_20_31.232234", "path": ["results_2023-11-05T03-20-31.232234.parquet"]}, {"split": "latest", "path": ["results_2023-11-05T03-20-31.232234.parquet"]}]}]}
|
2023-12-01T14:15:37+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of chargoddard/llama-2-26b-trenchcoat-stack
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model chargoddard/llama-2-26b-trenchcoat-stack on the Open LLM Leaderboard.
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-11-05T03:20:31.232234 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one under its own configuration, in the "latest" split of each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of chargoddard/llama-2-26b-trenchcoat-stack",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model chargoddard/llama-2-26b-trenchcoat-stack on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-05T03:20:31.232234(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of chargoddard/llama-2-26b-trenchcoat-stack",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model chargoddard/llama-2-26b-trenchcoat-stack on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-05T03:20:31.232234(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
27,
31,
176,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of chargoddard/llama-2-26b-trenchcoat-stack## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model chargoddard/llama-2-26b-trenchcoat-stack on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-05T03:20:31.232234(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
e5cd0d609ee31177a925081373a653e11ef52e99
|
# Dataset Card for "gtzan_all_preprocessed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
fmagot01/gtzan_all_preprocessed
|
[
"region:us"
] |
2023-09-12T19:55:58+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "label", "dtype": {"class_label": {"names": {"0": "blues", "1": "classical", "2": "country", "3": "disco", "4": "hiphop", "5": "jazz", "6": "metal", "7": "pop", "8": "reggae", "9": "rock"}}}}, {"name": "input_values", "sequence": "float32"}, {"name": "attention_mask", "sequence": "int32"}], "splits": [{"name": "train", "num_bytes": 3452159816, "num_examples": 899}, {"name": "test", "num_bytes": 384000696, "num_examples": 100}], "download_size": 1923103923, "dataset_size": 3836160512}}
|
2023-09-12T19:57:19+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "gtzan_all_preprocessed"
More Information needed
|
[
"# Dataset Card for \"gtzan_all_preprocessed\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"gtzan_all_preprocessed\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"gtzan_all_preprocessed\"\n\nMore Information needed"
] |
4fca634d549baf095487072439753a90f9051170
|
## Citation and Contact Information
### Cite
Please cite our paper if you use any code, data, or models.
```bibtex
@inproceedings{shah-etal-2023-trillion,
title = "Trillion Dollar Words: A New Financial Dataset, Task {\&} Market Analysis",
author = "Shah, Agam and
Paturi, Suvan and
Chava, Sudheer",
booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.acl-long.368",
doi = "10.18653/v1/2023.acl-long.368",
pages = "6664--6679",
abstract = "Monetary policy pronouncements by Federal Open Market Committee (FOMC) are a major driver of financial market returns. We construct the largest tokenized and annotated dataset of FOMC speeches, meeting minutes, and press conference transcripts in order to understand how monetary policy influences financial markets. In this study, we develop a novel task of hawkish-dovish classification and benchmark various pre-trained language models on the proposed dataset. Using the best-performing model (RoBERTa-large), we construct a measure of monetary policy stance for the FOMC document release days. To evaluate the constructed measure, we study its impact on the treasury market, stock market, and macroeconomic indicators. Our dataset, models, and code are publicly available on Huggingface and GitHub under CC BY-NC 4.0 license.",
}
```
### Contact Information
Please contact Agam Shah (ashah482[at]gatech[dot]edu) for any issues and questions.
GitHub: [@shahagam4](https://github.com/shahagam4)
Website: [https://shahagam4.github.io/](https://shahagam4.github.io/)
|
gtfintechlab/fomc_communication
|
[
"task_categories:text-classification",
"size_categories:1K<n<10K",
"language:en",
"license:cc-by-nc-4.0",
"finance",
"region:us"
] |
2023-09-12T20:00:59+00:00
|
{"language": ["en"], "license": "cc-by-nc-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-classification"], "tags": ["finance"]}
|
2023-09-12T20:18:49+00:00
|
[] |
[
"en"
] |
TAGS
#task_categories-text-classification #size_categories-1K<n<10K #language-English #license-cc-by-nc-4.0 #finance #region-us
|
Citation and Contact Information
### Cite
Please cite our paper if you use any code, data, or models.
### Contact Information
Please contact Agam Shah (ashah482[at]gatech[dot]edu) for any issues and questions.
GitHub: @shahagam4
Website: URL
|
[
"### Cite\nPlease cite our paper if you use any code, data, or models.",
"### Contact Information\n\nPlease contact Agam Shah (ashah482[at]gatech[dot]edu) for any issues and questions. \nGitHub: @shahagam4 \nWebsite: URL"
] |
[
"TAGS\n#task_categories-text-classification #size_categories-1K<n<10K #language-English #license-cc-by-nc-4.0 #finance #region-us \n",
"### Cite\nPlease cite our paper if you use any code, data, or models.",
"### Contact Information\n\nPlease contact Agam Shah (ashah482[at]gatech[dot]edu) for any issues and questions. \nGitHub: @shahagam4 \nWebsite: URL"
] |
[
47,
19,
41
] |
[
"passage: TAGS\n#task_categories-text-classification #size_categories-1K<n<10K #language-English #license-cc-by-nc-4.0 #finance #region-us \n### Cite\nPlease cite our paper if you use any code, data, or models.### Contact Information\n\nPlease contact Agam Shah (ashah482[at]gatech[dot]edu) for any issues and questions. \nGitHub: @shahagam4 \nWebsite: URL"
] |
560ed791f2e28bef37fbcdc670275aa6badd2b72
|
# Dataset Card for "paper_test_assym_distilbert_results"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
nikchar/paper_test_assym_distilbert_results
|
[
"region:us"
] |
2023-09-12T20:12:48+00:00
|
{"dataset_info": {"features": [{"name": "claim", "dtype": "string"}, {"name": "evidence_wiki_url", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "retrieved_evidence_title", "sequence": "string"}, {"name": "retrieved_evidence_text", "sequence": "string"}, {"name": "labels", "dtype": "int64"}, {"name": "Retrieval_Success", "dtype": "bool"}, {"name": "Predicted_Labels", "dtype": "int64"}, {"name": "Predicted_Labels_Each_doc", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 73601741, "num_examples": 11073}], "download_size": 34426513, "dataset_size": 73601741}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-12T20:12:52+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "paper_test_assym_distilbert_results"
More Information needed
|
[
"# Dataset Card for \"paper_test_assym_distilbert_results\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"paper_test_assym_distilbert_results\"\n\nMore Information needed"
] |
[
6,
25
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"paper_test_assym_distilbert_results\"\n\nMore Information needed"
] |
f2b64a358132305095e602ce11819d834344d90f
|
# Original Songs Lyrics with French Translation
### Dataset Summary
Dataset of 99,289 songs, each with its metadata (author, album, release date, song number), original lyrics, and lyrics translated into French.
Details of the number of songs by language of origin can be found in the table below:
| Original language | Number of songs |
|---|:---|
| en | 75786 |
| fr | 18486 |
| es | 1743 |
| it | 803 |
| de | 691 |
| sw | 529 |
| ko | 193 |
| id | 169 |
| pt | 142 |
| no | 122 |
| fi | 113 |
| sv | 70 |
| hr | 53 |
| so | 43 |
| ca | 41 |
| tl | 36 |
| ja | 35 |
| nl | 32 |
| ru | 29 |
| et | 27 |
| tr | 22 |
| ro | 19 |
| cy | 14 |
| vi | 14 |
| af | 13 |
| hu | 10 |
| sk | 10 |
| sl | 10 |
| cs | 7 |
| da | 6 |
| pl | 5 |
| sq | 4 |
| el | 4 |
| he | 3 |
| zh-cn | 2 |
| th | 1 |
| bg | 1 |
| ar | 1 |
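To browse the corpus, here is a minimal loading sketch; the repository id and feature names are taken from the dataset metadata further below, and the Hugging Face `datasets` library is assumed:

```python
from datasets import load_dataset

# Load the single "train" split declared in the dataset metadata.
songs = load_dataset(
    "Nicolas-BZRD/English_French_Songs_Lyrics_Translation_Original",
    split="train",
)

# Each row carries the song metadata plus both lyric versions.
row = songs[0]
print(row["artist_name"], "-", row["title"], f"({row['language']})")
print(row["original_version"][:120])
print(row["french_version"][:120])
```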
## Citation
Our work can be cited as:
```bibtex
@misc{faysse2024croissantllm,
title={CroissantLLM: A Truly Bilingual French-English Language Model},
author={Manuel Faysse and Patrick Fernandes and Nuno Guerreiro and António Loison and Duarte Alves and Caio Corro and Nicolas Boizard and João Alves and Ricardo Rei and Pedro Martins and Antoni Bigata Casademunt and François Yvon and André Martins and Gautier Viaud and Céline Hudelot and Pierre Colombo},
year={2024},
eprint={2402.00786},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
Nicolas-BZRD/English_French_Songs_Lyrics_Translation_Original
|
[
"task_categories:translation",
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:fr",
"language:en",
"language:es",
"language:it",
"language:de",
"language:ko",
"language:id",
"language:pt",
"language:no",
"language:fi",
"language:sv",
"language:sw",
"language:hr",
"language:so",
"language:ca",
"language:tl",
"language:ja",
"language:nl",
"language:ru",
"language:et",
"language:tr",
"language:ro",
"language:cy",
"language:vi",
"language:af",
"language:hu",
"language:sk",
"language:sl",
"language:cs",
"language:da",
"language:pl",
"language:sq",
"language:el",
"language:he",
"language:zh",
"language:th",
"language:bg",
"language:ar",
"license:unknown",
"music",
"parallel",
"parallel data",
"arxiv:2402.00786",
"region:us"
] |
2023-09-12T20:21:44+00:00
|
{"language": ["fr", "en", "es", "it", "de", "ko", "id", "pt", "no", "fi", "sv", "sw", "hr", "so", "ca", "tl", "ja", "nl", "ru", "et", "tr", "ro", "cy", "vi", "af", "hu", "sk", "sl", "cs", "da", "pl", "sq", "el", "he", "zh", "th", "bg", "ar"], "license": "unknown", "size_categories": ["10K<n<100K"], "task_categories": ["translation", "text-generation"], "pretty_name": "SYFT", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "artist_name", "dtype": "string"}, {"name": "album_name", "dtype": "string"}, {"name": "year", "dtype": "int64"}, {"name": "title", "dtype": "string"}, {"name": "number", "dtype": "int64"}, {"name": "original_version", "dtype": "string"}, {"name": "french_version", "dtype": "string"}, {"name": "language", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 250317845, "num_examples": 99289}], "download_size": 122323323, "dataset_size": 250317845}, "tags": ["music", "parallel", "parallel data"]}
|
2024-02-08T23:34:15+00:00
|
[
"2402.00786"
] |
[
"fr",
"en",
"es",
"it",
"de",
"ko",
"id",
"pt",
"no",
"fi",
"sv",
"sw",
"hr",
"so",
"ca",
"tl",
"ja",
"nl",
"ru",
"et",
"tr",
"ro",
"cy",
"vi",
"af",
"hu",
"sk",
"sl",
"cs",
"da",
"pl",
"sq",
"el",
"he",
"zh",
"th",
"bg",
"ar"
] |
TAGS
#task_categories-translation #task_categories-text-generation #size_categories-10K<n<100K #language-French #language-English #language-Spanish #language-Italian #language-German #language-Korean #language-Indonesian #language-Portuguese #language-Norwegian #language-Finnish #language-Swedish #language-Swahili (macrolanguage) #language-Croatian #language-Somali #language-Catalan #language-Tagalog #language-Japanese #language-Dutch #language-Russian #language-Estonian #language-Turkish #language-Romanian #language-Welsh #language-Vietnamese #language-Afrikaans #language-Hungarian #language-Slovak #language-Slovenian #language-Czech #language-Danish #language-Polish #language-Albanian #language-Modern Greek (1453-) #language-Hebrew #language-Chinese #language-Thai #language-Bulgarian #language-Arabic #license-unknown #music #parallel #parallel data #arxiv-2402.00786 #region-us
|
Original Songs Lyrics with French Translation
=============================================
### Dataset Summary
Dataset of 99,289 songs, each with its metadata (author, album, release date, song number), original lyrics, and lyrics translated into French.
Details of the number of songs by language of origin can be found in the table below:
Our work can be cited as:
|
[
"### Dataset Summary\n\n\nDataset of 99289 songs containing their metadata (author, album, release date, song number), original lyrics and lyrics translated into French.\n\n\nDetails of the number of songs by language of origin can be found in the table below:\n\n\n\nOur work can be cited as:"
] |
[
"TAGS\n#task_categories-translation #task_categories-text-generation #size_categories-10K<n<100K #language-French #language-English #language-Spanish #language-Italian #language-German #language-Korean #language-Indonesian #language-Portuguese #language-Norwegian #language-Finnish #language-Swedish #language-Swahili (macrolanguage) #language-Croatian #language-Somali #language-Catalan #language-Tagalog #language-Japanese #language-Dutch #language-Russian #language-Estonian #language-Turkish #language-Romanian #language-Welsh #language-Vietnamese #language-Afrikaans #language-Hungarian #language-Slovak #language-Slovenian #language-Czech #language-Danish #language-Polish #language-Albanian #language-Modern Greek (1453-) #language-Hebrew #language-Chinese #language-Thai #language-Bulgarian #language-Arabic #license-unknown #music #parallel #parallel data #arxiv-2402.00786 #region-us \n",
"### Dataset Summary\n\n\nDataset of 99289 songs containing their metadata (author, album, release date, song number), original lyrics and lyrics translated into French.\n\n\nDetails of the number of songs by language of origin can be found in the table below:\n\n\n\nOur work can be cited as:"
] |
[
281,
66
] |
[
"passage: TAGS\n#task_categories-translation #task_categories-text-generation #size_categories-10K<n<100K #language-French #language-English #language-Spanish #language-Italian #language-German #language-Korean #language-Indonesian #language-Portuguese #language-Norwegian #language-Finnish #language-Swedish #language-Swahili (macrolanguage) #language-Croatian #language-Somali #language-Catalan #language-Tagalog #language-Japanese #language-Dutch #language-Russian #language-Estonian #language-Turkish #language-Romanian #language-Welsh #language-Vietnamese #language-Afrikaans #language-Hungarian #language-Slovak #language-Slovenian #language-Czech #language-Danish #language-Polish #language-Albanian #language-Modern Greek (1453-) #language-Hebrew #language-Chinese #language-Thai #language-Bulgarian #language-Arabic #license-unknown #music #parallel #parallel data #arxiv-2402.00786 #region-us \n### Dataset Summary\n\n\nDataset of 99289 songs containing their metadata (author, album, release date, song number), original lyrics and lyrics translated into French.\n\n\nDetails of the number of songs by language of origin can be found in the table below:\n\n\n\nOur work can be cited as:"
] |
59a457fd792911babeff78c9045596b5e87413e5
|
# Dataset Card for Evaluation run of KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged](https://huggingface.co/KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__openllama_3b_EvolInstruct_lora_merged",
"harness_winogrande_5",
split="train")
```
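To discover the available configurations, or to read the aggregated scores directly, a minimal sketch (relying only on the "results" configuration and the "latest" split described above) is:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_KnutJaegersberg__openllama_3b_EvolInstruct_lora_merged"

# List the evaluation configurations (plus the aggregated "results" config).
print(get_dataset_config_names(repo))

# "latest" always points at the most recent run of a configuration.
results = load_dataset(repo, "results", split="latest")
```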
## Latest results
These are the [latest results from run 2023-10-29T07:08:34.120359](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__openllama_3b_EvolInstruct_lora_merged/blob/main/results_2023-10-29T07-08-34.120359.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857095,
"f1": 0.05134962248322172,
"f1_stderr": 0.0012730168443049574,
"acc": 0.3395923103113801,
"acc_stderr": 0.007914879526646601
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857095,
"f1": 0.05134962248322172,
"f1_stderr": 0.0012730168443049574
},
"harness|gsm8k|5": {
"acc": 0.009097801364670205,
"acc_stderr": 0.002615326510775673
},
"harness|winogrande|5": {
"acc": 0.67008681925809,
"acc_stderr": 0.013214432542517527
}
}
```
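If you prefer the raw JSON over the `datasets` API, the results file linked above can be fetched directly. A minimal sketch follows; note that, depending on the file layout, the metrics may sit under a top-level "results" key rather than at the root, so the code tolerates both:

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_KnutJaegersberg__openllama_3b_EvolInstruct_lora_merged",
    repo_type="dataset",
    filename="results_2023-10-29T07-08-34.120359.json",
)
with open(path) as f:
    data = json.load(f)

metrics = data.get("results", data)  # tolerate either layout
print(metrics["all"])                # aggregated em / f1 / acc values
```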
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_KnutJaegersberg__openllama_3b_EvolInstruct_lora_merged
|
[
"region:us"
] |
2023-09-12T20:28:48+00:00
|
{"pretty_name": "Evaluation run of KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged](https://huggingface.co/KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__openllama_3b_EvolInstruct_lora_merged\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-29T07:08:34.120359](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__openllama_3b_EvolInstruct_lora_merged/blob/main/results_2023-10-29T07-08-34.120359.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857095,\n \"f1\": 0.05134962248322172,\n \"f1_stderr\": 0.0012730168443049574,\n \"acc\": 0.3395923103113801,\n \"acc_stderr\": 0.007914879526646601\n },\n \"harness|drop|3\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857095,\n \"f1\": 0.05134962248322172,\n \"f1_stderr\": 0.0012730168443049574\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \"acc_stderr\": 0.002615326510775673\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.67008681925809,\n \"acc_stderr\": 0.013214432542517527\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|arc:challenge|25_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_29T07_08_34.120359", "path": ["**/details_harness|drop|3_2023-10-29T07-08-34.120359.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-29T07-08-34.120359.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_29T07_08_34.120359", "path": ["**/details_harness|gsm8k|5_2023-10-29T07-08-34.120359.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-29T07-08-34.120359.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": 
["**/details_harness|hellaswag|10_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-28-35.383540.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-28-35.383540.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-28-35.383540.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-09-12T21-28-35.383540.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-28-35.383540.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T21-28-35.383540.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T21-28-35.383540.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_29T07_08_34.120359", "path": ["**/details_harness|winogrande|5_2023-10-29T07-08-34.120359.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-29T07-08-34.120359.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_12T21_28_35.383540", "path": ["results_2023-09-12T21-28-35.383540.parquet"]}, {"split": "2023_10_29T07_08_34.120359", "path": ["results_2023-10-29T07-08-34.120359.parquet"]}, {"split": "latest", "path": ["results_2023-10-29T07-08-34.120359.parquet"]}]}]}
|
2023-10-29T07:08:47+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-29T07:08:34.120359 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-29T07:08:34.120359(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-29T07:08:34.120359(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
32,
31,
180,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/openllama_3b_EvolInstruct_lora_merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-29T07:08:34.120359(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
3856ca97e4a5c31787d3357b2b54c3f360d60404
|
# Dataset Card for Evaluation run of grimpep/MythoMax-L2-33b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/grimpep/MythoMax-L2-33b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [grimpep/MythoMax-L2-33b](https://huggingface.co/grimpep/MythoMax-L2-33b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_grimpep__MythoMax-L2-33b",
"harness_truthfulqa_mc_0",
split="train")
```
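Once loaded, the split behaves like any other `datasets` split; a quick way to eyeball the per-example records (the exact column names come from the eval harness and are not listed in this card) is:

```python
# Continuing from the snippet above: inspect the loaded details split.
df = data.to_pandas()
print(df.columns.tolist())  # per-example fields produced by the harness
print(df.head())
```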
## Latest results
These are the [latest results from run 2023-09-12T21:46:34.528264](https://huggingface.co/datasets/open-llm-leaderboard/details_grimpep__MythoMax-L2-33b/blob/main/results_2023-09-12T21-46-34.528264.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.510197997446698,
"acc_stderr": 0.034698207813013165,
"acc_norm": 0.5143638265847267,
"acc_norm_stderr": 0.03468158572078421,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.524808738389582,
"mc2_stderr": 0.015873078551875083
},
"harness|arc:challenge|25": {
"acc": 0.5392491467576792,
"acc_stderr": 0.014566303676636583,
"acc_norm": 0.5725255972696246,
"acc_norm_stderr": 0.014456862944650649
},
"harness|hellaswag|10": {
"acc": 0.5786695877315275,
"acc_stderr": 0.004927631806477559,
"acc_norm": 0.7911770563632743,
"acc_norm_stderr": 0.0040563690969549395
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.03065674869673943,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.03065674869673943
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923183,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923183
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.02357760479165581,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.02357760479165581
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5709677419354838,
"acc_stderr": 0.028156036538233193,
"acc_norm": 0.5709677419354838,
"acc_norm_stderr": 0.028156036538233193
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.036462049632538115,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.036462049632538115
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6313131313131313,
"acc_stderr": 0.034373055019806184,
"acc_norm": 0.6313131313131313,
"acc_norm_stderr": 0.034373055019806184
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7512953367875648,
"acc_stderr": 0.03119584087770029,
"acc_norm": 0.7512953367875648,
"acc_norm_stderr": 0.03119584087770029
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4641025641025641,
"acc_stderr": 0.025285585990017848,
"acc_norm": 0.4641025641025641,
"acc_norm_stderr": 0.025285585990017848
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.03243718055137411,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.03243718055137411
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6697247706422018,
"acc_stderr": 0.020164466336342977,
"acc_norm": 0.6697247706422018,
"acc_norm_stderr": 0.020164466336342977
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.031415546294025445,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.031415546294025445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.0299366963871386,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.0299366963871386
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134986,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134986
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.04721188506097172,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.04721188506097172
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890488,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890488
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6934865900383141,
"acc_stderr": 0.016486952893041508,
"acc_norm": 0.6934865900383141,
"acc_norm_stderr": 0.016486952893041508
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5289017341040463,
"acc_stderr": 0.026874085883518348,
"acc_norm": 0.5289017341040463,
"acc_norm_stderr": 0.026874085883518348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2324022346368715,
"acc_stderr": 0.014125968754673384,
"acc_norm": 0.2324022346368715,
"acc_norm_stderr": 0.014125968754673384
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.02849199358617156,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.02849199358617156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.02777091853142784,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.02777091853142784
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.027339546640662737,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.027339546640662737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704732,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704732
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39113428943937417,
"acc_stderr": 0.012463861839982064,
"acc_norm": 0.39113428943937417,
"acc_norm_stderr": 0.012463861839982064
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.030320243265004137,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.030320243265004137
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5212418300653595,
"acc_stderr": 0.020209572388600244,
"acc_norm": 0.5212418300653595,
"acc_norm_stderr": 0.020209572388600244
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691583,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691583
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.016586304901762564,
"mc2": 0.524808738389582,
"mc2_stderr": 0.015873078551875083
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_grimpep__MythoMax-L2-33b
|
[
"region:us"
] |
2023-09-12T20:46:50+00:00
|
{"pretty_name": "Evaluation run of grimpep/MythoMax-L2-33b", "dataset_summary": "Dataset automatically created during the evaluation run of model [grimpep/MythoMax-L2-33b](https://huggingface.co/grimpep/MythoMax-L2-33b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_grimpep__MythoMax-L2-33b\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-12T21:46:34.528264](https://huggingface.co/datasets/open-llm-leaderboard/details_grimpep__MythoMax-L2-33b/blob/main/results_2023-09-12T21-46-34.528264.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.510197997446698,\n \"acc_stderr\": 0.034698207813013165,\n \"acc_norm\": 0.5143638265847267,\n \"acc_norm_stderr\": 0.03468158572078421,\n \"mc1\": 0.3402692778457772,\n \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.524808738389582,\n \"mc2_stderr\": 0.015873078551875083\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5392491467576792,\n \"acc_stderr\": 0.014566303676636583,\n \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.014456862944650649\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5786695877315275,\n \"acc_stderr\": 0.004927631806477559,\n \"acc_norm\": 0.7911770563632743,\n \"acc_norm_stderr\": 0.0040563690969549395\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.03065674869673943,\n \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.03065674869673943\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.04174752578923183,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.04174752578923183\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29894179894179895,\n \"acc_stderr\": 0.02357760479165581,\n \"acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.02357760479165581\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5709677419354838,\n \"acc_stderr\": 0.028156036538233193,\n \"acc_norm\": 0.5709677419354838,\n \"acc_norm_stderr\": 0.028156036538233193\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6313131313131313,\n \"acc_stderr\": 0.034373055019806184,\n \"acc_norm\": 0.6313131313131313,\n \"acc_norm_stderr\": 0.034373055019806184\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.03119584087770029,\n \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.03119584087770029\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.025285585990017848,\n 
\"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.025285585990017848\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6697247706422018,\n \"acc_stderr\": 0.020164466336342977,\n \"acc_norm\": 0.6697247706422018,\n \"acc_norm_stderr\": 0.020164466336342977\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.031415546294025445,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.031415546294025445\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.031321798030832904,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.031321798030832904\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6962025316455697,\n \"acc_stderr\": 0.0299366963871386,\n \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.0299366963871386\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.03210062154134986,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.03210062154134986\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.04721188506097172,\n \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.04721188506097172\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n \"acc_stderr\": 0.026246772946890488,\n \"acc_norm\": 0.7991452991452992,\n \"acc_norm_stderr\": 0.026246772946890488\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6934865900383141,\n \"acc_stderr\": 0.016486952893041508,\n \"acc_norm\": 0.6934865900383141,\n \"acc_norm_stderr\": 
0.016486952893041508\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5289017341040463,\n \"acc_stderr\": 0.026874085883518348,\n \"acc_norm\": 0.5289017341040463,\n \"acc_norm_stderr\": 0.026874085883518348\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2324022346368715,\n \"acc_stderr\": 0.014125968754673384,\n \"acc_norm\": 0.2324022346368715,\n \"acc_norm_stderr\": 0.014125968754673384\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.02849199358617156,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.02849199358617156\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n \"acc_stderr\": 0.02777091853142784,\n \"acc_norm\": 0.6045016077170418,\n \"acc_norm_stderr\": 0.02777091853142784\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.027339546640662737,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.027339546640662737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704732,\n \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704732\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39113428943937417,\n \"acc_stderr\": 0.012463861839982064,\n \"acc_norm\": 0.39113428943937417,\n \"acc_norm_stderr\": 0.012463861839982064\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004137,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004137\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5212418300653595,\n \"acc_stderr\": 0.020209572388600244,\n \"acc_norm\": 0.5212418300653595,\n \"acc_norm_stderr\": 0.020209572388600244\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.031680911612338825,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.031680911612338825\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691583,\n \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691583\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n \"mc1_stderr\": 0.016586304901762564,\n \"mc2\": 0.524808738389582,\n \"mc2_stderr\": 0.015873078551875083\n }\n}\n```", "repo_url": "https://huggingface.co/grimpep/MythoMax-L2-33b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": 
["**/details_harness|arc:challenge|25_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hellaswag|10_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-46-34.528264.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-46-34.528264.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-46-34.528264.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T21-46-34.528264.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T21-46-34.528264.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_12T21_46_34.528264", "path": ["results_2023-09-12T21-46-34.528264.parquet"]}, {"split": "latest", "path": ["results_2023-09-12T21-46-34.528264.parquet"]}]}]}
|
2023-09-12T20:47:50+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of grimpep/MythoMax-L2-33b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model grimpep/MythoMax-L2-33b on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
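```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_grimpep__MythoMax-L2-33b",
    "harness_truthfulqa_mc_0",
    split="train",
)
```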
## Latest results
These are the latest results from run 2023-09-12T21:46:34.528264 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
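For instance, a minimal sketch for pulling the aggregated metrics (the "results" configuration listed in this card's metadata), using the "latest" split naming described above:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run;
# the "latest" split always points to the newest results.
results = load_dataset(
    "open-llm-leaderboard/details_grimpep__MythoMax-L2-33b",
    "results",
    split="latest",
)
print(results[0])
```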
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of grimpep/MythoMax-L2-33b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model grimpep/MythoMax-L2-33b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-12T21:46:34.528264(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of grimpep/MythoMax-L2-33b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model grimpep/MythoMax-L2-33b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-12T21:46:34.528264(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of grimpep/MythoMax-L2-33b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model grimpep/MythoMax-L2-33b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-12T21:46:34.528264(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
0d967065e41083915810d792a15bb801c9436cd1
|
# Dataset Card for "paper_test_assym_roberta_results"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
nikchar/paper_test_assym_roberta_results
|
[
"region:us"
] |
2023-09-12T20:49:22+00:00
|
{"dataset_info": {"features": [{"name": "claim", "dtype": "string"}, {"name": "evidence_wiki_url", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "retrieved_evidence_title", "sequence": "string"}, {"name": "retrieved_evidence_text", "sequence": "string"}, {"name": "labels", "dtype": "int64"}, {"name": "Retrieval_Success", "dtype": "bool"}, {"name": "Predicted_Labels", "dtype": "int64"}, {"name": "Predicted_Labels_Each_doc", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 73601741, "num_examples": 11073}], "download_size": 34426502, "dataset_size": 73601741}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-12T20:49:26+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "paper_test_assym_roberta_results"
More Information needed
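A minimal loading sketch, assuming the default configuration and the single `train` split recorded in the metadata above:

```python
from datasets import load_dataset

# 11,073 examples with claim/evidence fields, retrieval flags,
# and predicted labels (see the dataset_info metadata above).
ds = load_dataset("nikchar/paper_test_assym_roberta_results", split="train")
print(ds.features)
```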
|
[
"# Dataset Card for \"paper_test_assym_roberta_results\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"paper_test_assym_roberta_results\"\n\nMore Information needed"
] |
[
6,
24
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"paper_test_assym_roberta_results\"\n\nMore Information needed"
] |
2f14bc328a2df981925f1b954efc8fdbc9518147
|
# Dataset of aether_foundation_employee/エーテル財団職員 (Pokémon)
This is the dataset of aether_foundation_employee/エーテル財団職員 (Pokémon), containing 185 images and their tags.
The core tags of this character are `dark_skin, short_hair, black_hair, dark-skinned_female, hat, breasts, white_headwear, cabbie_hat, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 185 | 183.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aether_foundation_employee_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 185 | 110.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aether_foundation_employee_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 478 | 240.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aether_foundation_employee_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 185 | 163.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aether_foundation_employee_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 478 | 319.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aether_foundation_employee_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/aether_foundation_employee_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
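If you only need the resized images and their tags rather than the full waifuc metadata, the IMG+TXT packages can be read directly. The following is a minimal sketch, assuming the conventional IMG+TXT layout in which every image in the archive is accompanied by a sibling `.txt` file of comma-separated tags; the extraction directory `dataset_800` is an illustrative choice.
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from PIL import Image

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/aether_foundation_employee_pokemon',
    repo_type='dataset',
    filename='dataset-800.zip',
)
pkg_dir = 'dataset_800'  # illustrative extraction directory
os.makedirs(pkg_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(pkg_dir)

# assumption: each image ships with a sibling .txt file of comma-separated tags
for fname in sorted(os.listdir(pkg_dir)):
    stem, ext = os.path.splitext(fname)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    image = Image.open(os.path.join(pkg_dir, fname))
    with open(os.path.join(pkg_dir, stem + '.txt'), encoding='utf-8') as f:
        tags = f.read().strip()
    print(fname, image.size, tags)
```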
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1boy, 1girl, blush, hetero, penis, sex, sweat, vaginal, white_gloves, nude, open_mouth, girl_on_top, navel, nipples, cum_in_pussy, smile, solo_focus, bar_censor, large_breasts, spread_legs, squatting_cowgirl_position, thighhighs |
| 1 | 9 |  |  |  |  |  | 1boy, 1girl, clothed_female_nude_male, hetero, penis, short_sleeves, testicles, white_gloves, gloved_handjob, blush, cum, dark-skinned_male, interracial, mosaic_censoring, open_mouth, bangs, pouch, sweat |
| 2 | 9 |  |  |  |  |  | 1boy, 1girl, blush, hetero, penis, solo_focus, white_gloves, short_sleeves, censored, open_mouth, tongue_out, fellatio, cum_in_mouth, facial, large_breasts, nude, simple_background, white_background |
| 3 | 9 |  |  |  |  |  | 1girl, short_sleeves, white_gloves, shoes, white_footwear, simple_background, white_background, looking_at_viewer, open_mouth, pokemon_(creature), white_pantyhose, full_body, pouch, short_jumpsuit, thigh_strap |
| 4 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_gloves, holding_poke_ball, poke_ball_(basic), short_sleeves, blush, closed_mouth, pantyhose, white_background, bangs, hand_on_hip, simple_background, smile, uniform |
| 5 | 5 |  |  |  |  |  | 1girl, simple_background, solo, white_background, white_bikini, gigantic_breasts, looking_at_viewer, smile, huge_breasts, navel |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | blush | hetero | penis | sex | sweat | vaginal | white_gloves | nude | open_mouth | girl_on_top | navel | nipples | cum_in_pussy | smile | solo_focus | bar_censor | large_breasts | spread_legs | squatting_cowgirl_position | thighhighs | clothed_female_nude_male | short_sleeves | testicles | gloved_handjob | cum | dark-skinned_male | interracial | mosaic_censoring | bangs | pouch | censored | tongue_out | fellatio | cum_in_mouth | facial | simple_background | white_background | shoes | white_footwear | looking_at_viewer | pokemon_(creature) | white_pantyhose | full_body | short_jumpsuit | thigh_strap | solo | holding_poke_ball | poke_ball_(basic) | closed_mouth | pantyhose | hand_on_hip | uniform | white_bikini | gigantic_breasts | huge_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:--------|:---------|:--------|:------|:--------|:----------|:---------------|:-------|:-------------|:--------------|:--------|:----------|:---------------|:--------|:-------------|:-------------|:----------------|:--------------|:-----------------------------|:-------------|:---------------------------|:----------------|:------------|:-----------------|:------|:--------------------|:--------------|:-------------------|:--------|:--------|:-----------|:-------------|:-----------|:---------------|:---------|:--------------------|:-------------------|:--------|:-----------------|:--------------------|:---------------------|:------------------|:------------|:-----------------|:--------------|:-------|:--------------------|:--------------------|:---------------|:------------|:--------------|:----------|:---------------|:-------------------|:---------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | | X | | X | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | X | | | | X | X | X | | | | | | X | | X | | | | | X | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | | X | | | | | | | X | | X | | | | | | | | | | | | | X | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | | X | X | | | | | | X | | | | | | | X | | | | | | | | X | | | | | | | X | | | | | | | X | X | | | X | | | | | | X | X | X | X | X | X | X | | | |
| 5 | 5 |  |  |  |  |  | | X | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | | | X | | | | | | X | | | | | | | X | X | X |
|
CyberHarem/aether_foundation_employee_pokemon
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-12T21:01:16+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T22:05:23+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of aether\_foundation\_employee/エーテル財団職員 (Pokémon)
==========================================================
This is the dataset of aether\_foundation\_employee/エーテル財団職員 (Pokémon), containing 185 images and their tags.
The core tags of this character are 'dark\_skin, short\_hair, black\_hair, dark-skinned\_female, hat, breasts, white\_headwear, cabbie\_hat, brown\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
f9862b7ff6b4f187d045dd35ed1e112fd951894e
|
# Dataset Card for "test-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Kranajan/test-llama2-1k
|
[
"region:us"
] |
2023-09-12T21:08:10+00:00
|
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 104225, "num_examples": 284}], "download_size": 55095, "dataset_size": 104225}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-12T21:08:11+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "test-llama2-1k"
More Information needed
|
[
"# Dataset Card for \"test-llama2-1k\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"test-llama2-1k\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"test-llama2-1k\"\n\nMore Information needed"
] |
5ce9ff4dff664992471b50882a32de2b2712898b
|
# Dataset of sakuma_mayu/佐久間まゆ/사쿠마마유 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of sakuma_mayu/佐久間まゆ/사쿠마마유 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `brown_hair, blue_eyes, hairband, ribbon, breasts, short_hair, bangs, earrings, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 713.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sakuma_mayu_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 409.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sakuma_mayu_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1203 | 870.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sakuma_mayu_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 634.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sakuma_mayu_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1203 | 1.22 GiB | [Download](https://huggingface.co/datasets/CyberHarem/sakuma_mayu_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sakuma_mayu_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
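Once the raw package is extracted as above, a quick way to survey the remaining (non-pruned) tags is to tally their frequencies across all items. This is a minimal sketch, assuming that iterating `item.meta['tags']` yields tag names (which holds whether it is a list of tags or a tag-to-score mapping):
```python
from collections import Counter
from waifuc.source import LocalSource

dataset_dir = 'dataset_dir'  # the extraction directory used above
counter = Counter()
for item in LocalSource(dataset_dir):
    # iterating meta['tags'] yields tag names for both lists and dicts
    for tag in item.meta['tags']:
        counter[tag] += 1

# show the 20 most frequent tags with their counts
for tag, count in counter.most_common(20):
    print(f'{count:4d}  {tag}')
```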
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, blush, dress, open_mouth, heart, bare_shoulders, choker, medium_breasts |
| 1 | 11 |  |  |  |  |  | 1girl, blush, looking_at_viewer, smile, solo, dress, heart-shaped_pupils, jewelry |
| 2 | 13 |  |  |  |  |  | 1girl, hair_flower, looking_at_viewer, red_dress, solo, jewelry, blush, rose, thorns, bare_shoulders, smile, black_gloves |
| 3 | 8 |  |  |  |  |  | 1girl, blush, looking_at_viewer, maid_headdress, medium_breasts, solo, wrist_cuffs, cleavage, heart, puffy_short_sleeves, red_ribbon, open_mouth, jewelry, waist_apron, frilled_apron, pink_dress, simple_background, white_background, :d, garter_straps, ribbon_trim, white_thighhighs |
| 4 | 6 |  |  |  |  |  | 1girl, blush, brown_dress, heart_earrings, looking_at_viewer, solo, beret, brown_sweater, heart_necklace, medium_breasts, smile, turtleneck_sweater, white_gloves, bare_shoulders, closed_mouth, ribbed_sweater, sleeveless_dress, brown_ribbon, single_glove, upper_body, white_background |
| 5 | 5 |  |  |  |  |  | 1girl, blazer, blush, looking_at_viewer, school_uniform, solo, shirt, smile, bowtie, open_mouth, plaid_skirt, red_bow, jewelry, long_sleeves, pleated_skirt, white_background |
| 6 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, medium_breasts, pink_bikini, solo, blush, necklace, smile, cleavage, frilled_bikini, barefoot, bikini_skirt, heart, navel, open_mouth |
| 7 | 23 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, blush, penis, open_mouth, medium_breasts, nipples, bar_censor, cum, smile, heart-shaped_pupils, looking_at_viewer, nude, pussy, sweat, navel, sex, girl_on_top, hair_ribbon, vaginal |
| 8 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, obi, blush, floral_print, pink_kimono, solo, wide_sleeves, closed_mouth, holding_umbrella, long_sleeves, oil-paper_umbrella, print_kimono, upper_body, :d, hair_bow, hair_flower, jewelry, open_mouth, pink_bow, single_hair_bun, vertical_stripes, white_gloves |
| 9 | 7 |  |  |  |  |  | looking_at_viewer, 1girl, angel_wings, bowtie, plaid, sleeveless_shirt, solo, blue_bow, blush, crop_top, feathered_wings, hair_bow, hair_ribbon, midriff, pleated_skirt, smile, halo, navel, simple_background, white_wings, wrist_cuffs, collarbone, cowboy_shot, frilled_shirt, low_twintails, miniskirt, open_mouth, pink_skirt, white_background, white_sailor_collar |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | blush | dress | open_mouth | heart | bare_shoulders | choker | medium_breasts | heart-shaped_pupils | jewelry | hair_flower | red_dress | rose | thorns | black_gloves | maid_headdress | wrist_cuffs | cleavage | puffy_short_sleeves | red_ribbon | waist_apron | frilled_apron | pink_dress | simple_background | white_background | :d | garter_straps | ribbon_trim | white_thighhighs | brown_dress | heart_earrings | beret | brown_sweater | heart_necklace | turtleneck_sweater | white_gloves | closed_mouth | ribbed_sweater | sleeveless_dress | brown_ribbon | single_glove | upper_body | blazer | school_uniform | shirt | bowtie | plaid_skirt | red_bow | long_sleeves | pleated_skirt | pink_bikini | necklace | frilled_bikini | barefoot | bikini_skirt | navel | 1boy | hetero | solo_focus | penis | nipples | bar_censor | cum | nude | pussy | sweat | sex | girl_on_top | hair_ribbon | vaginal | obi | floral_print | pink_kimono | wide_sleeves | holding_umbrella | oil-paper_umbrella | print_kimono | hair_bow | pink_bow | single_hair_bun | vertical_stripes | angel_wings | plaid | sleeveless_shirt | blue_bow | crop_top | feathered_wings | midriff | halo | white_wings | collarbone | cowboy_shot | frilled_shirt | low_twintails | miniskirt | pink_skirt | white_sailor_collar |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:--------|:--------|:-------------|:--------|:-----------------|:---------|:-----------------|:----------------------|:----------|:--------------|:------------|:-------|:---------|:---------------|:-----------------|:--------------|:-----------|:----------------------|:-------------|:--------------|:----------------|:-------------|:--------------------|:-------------------|:-----|:----------------|:--------------|:-------------------|:--------------|:-----------------|:--------|:----------------|:-----------------|:---------------------|:---------------|:---------------|:-----------------|:-------------------|:---------------|:---------------|:-------------|:---------|:-----------------|:--------|:---------|:--------------|:----------|:---------------|:----------------|:--------------|:-----------|:-----------------|:-----------|:---------------|:--------|:-------|:---------|:-------------|:--------|:----------|:-------------|:------|:-------|:--------|:--------|:------|:--------------|:--------------|:----------|:------|:---------------|:--------------|:---------------|:-------------------|:---------------------|:---------------|:-----------|:-----------|:------------------|:-------------------|:--------------|:--------|:-------------------|:-----------|:-----------|:------------------|:----------|:-------|:--------------|:-------------|:--------------|:----------------|:----------------|:------------|:-------------|:----------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | X | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 13 |  |  |  |  |  | X | X | X | X | X | | | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | | X | X | | X | X | | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | X | X | | | | X | | X | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | X | X | X | X | | X | X | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 23 |  |  |  |  |  | X | X | X | | X | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | X | X | | X | X | | X | | | | | | X | X | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | X | X | X | X | | X | | | | | | | | | | | | | X | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/sakuma_mayu_idolmastercinderellagirls
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-12T21:09:35+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T11:05:38+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of sakuma\_mayu/佐久間まゆ/사쿠마마유 (THE iDOLM@STER: Cinderella Girls)
======================================================================
This is the dataset of sakuma\_mayu/佐久間まゆ/사쿠마마유 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are 'brown\_hair, blue\_eyes, hairband, ribbon, breasts, short\_hair, bangs, earrings, bow', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
284be38581d74c69c93ec066b940077e95af7254
|
# Dataset of sumomo (Pokémon)
This is the dataset of sumomo (Pokémon), containing 147 images and their tags.
The core tags of this character are `short_hair, pink_hair, pink_eyes, bandaid_on_face`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 147 | 80.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sumomo_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 147 | 60.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sumomo_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 235 | 98.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sumomo_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 147 | 76.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sumomo_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 235 | 118.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sumomo_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sumomo_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
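The same source can also be used to carve out subsets, for example keeping only images tagged `solo`. This is a minimal sketch, assuming `item.image` is a PIL image and `item.meta['filename']` includes the file extension, consistent with the loading snippet above; the output directory `sumomo_solo` is an illustrative choice:
```python
import os
from waifuc.source import LocalSource

dataset_dir = 'dataset_dir'  # the extraction directory used above
solo_dir = 'sumomo_solo'     # illustrative output directory
os.makedirs(solo_dir, exist_ok=True)

for item in LocalSource(dataset_dir):
    # keep only images whose tag set contains 'solo'
    if 'solo' in item.meta['tags']:
        item.image.save(os.path.join(solo_dir, item.meta['filename']))
```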
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, barefoot, feet, fingerless_gloves, kicking, simple_background, white_background, bandaid_on_nose, open_mouth, soles, toes, clenched_hands, pokemon_(creature), bandaid_on_arm, blue_gloves, solo, white_pants |
| 1 | 9 |  |  |  |  |  | 1girl, fingerless_gloves, solo, bandaid_on_nose, open_mouth, pokemon_(creature) |
| 2 | 8 |  |  |  |  |  | 1girl, hetero, nipples, open_mouth, penis, pokemon_(creature), pokephilia, bandaid_on_nose, vaginal, 1boy, nude, barefoot, bestiality, blush, fingerless_gloves, heart, small_breasts, solo_focus, tongue, uncensored, bodysuit, cum_in_pussy, feet, interspecies, sex_from_behind, torn_clothes, watermark |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | barefoot | feet | fingerless_gloves | kicking | simple_background | white_background | bandaid_on_nose | open_mouth | soles | toes | clenched_hands | pokemon_(creature) | bandaid_on_arm | blue_gloves | solo | white_pants | hetero | nipples | penis | pokephilia | vaginal | 1boy | nude | bestiality | blush | heart | small_breasts | solo_focus | tongue | uncensored | bodysuit | cum_in_pussy | interspecies | sex_from_behind | torn_clothes | watermark |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------|:--------------------|:----------|:--------------------|:-------------------|:------------------|:-------------|:--------|:-------|:-----------------|:---------------------|:-----------------|:--------------|:-------|:--------------|:---------|:----------|:--------|:-------------|:----------|:-------|:-------|:-------------|:--------|:--------|:----------------|:-------------|:---------|:-------------|:-----------|:---------------|:---------------|:------------------|:---------------|:------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | | | X | | | | X | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | | | | X | X | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/sumomo_pokemon
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-12T21:29:41+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T21:58:28+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of sumomo (Pokémon)
===========================
This is the dataset of sumomo (Pokémon), containing 147 images and their tags.
The core tags of this character are 'short\_hair, pink\_hair, pink\_eyes, bandaid\_on\_face', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
6c03c5451ac74a093773d2c581d0f0f3720bd4d7
|
# Dataset Card for "pubmedsum"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
ChanceFocus/pubmedsum
|
[
"region:us"
] |
2023-09-12T21:43:15+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "query", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 11428, "num_examples": 1}, {"name": "test", "num_bytes": 4144995, "num_examples": 200}], "download_size": 2086997, "dataset_size": 4156423}}
|
2023-09-12T22:48:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "pubmedsum"
More Information needed
|
[
"# Dataset Card for \"pubmedsum\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"pubmedsum\"\n\nMore Information needed"
] |
[
6,
13
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"pubmedsum\"\n\nMore Information needed"
] |
9a0f6f2103757092d0bc87dd81be1b4858c582bf
|
# Dataset Card for Evaluation run of bongchoi/test-llama-2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bongchoi/test-llama-2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [bongchoi/test-llama-2-7b](https://huggingface.co/bongchoi/test-llama-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bongchoi__test-llama-2-7b",
"harness_truthfulqa_mc_0",
split="train")
```
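Since the repository holds 61 task configurations, it can help to enumerate the available config names before picking one; the `datasets` library provides a helper for this:
```python
from datasets import get_dataset_config_names

# list every configuration of the details repo (the per-task configs plus "results")
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_bongchoi__test-llama-2-7b"
)
print(len(configs))
print(configs[:5])
```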
## Latest results
These are the [latest results from run 2023-09-12T23:02:34.518107](https://huggingface.co/datasets/open-llm-leaderboard/details_bongchoi__test-llama-2-7b/blob/main/results_2023-09-12T23-02-34.518107.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.471008753299703,
"acc_stderr": 0.03528088196519964,
"acc_norm": 0.4749886536723232,
"acc_norm_stderr": 0.035266604173246285,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.3875084099562216,
"mc2_stderr": 0.013510147651392562
},
"harness|arc:challenge|25": {
"acc": 0.4931740614334471,
"acc_stderr": 0.014610029151379813,
"acc_norm": 0.5307167235494881,
"acc_norm_stderr": 0.014583792546304037
},
"harness|hellaswag|10": {
"acc": 0.5884285998805019,
"acc_stderr": 0.0049111251010646425,
"acc_norm": 0.785700059749054,
"acc_norm_stderr": 0.004094971980892084
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.03999309712777471,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.03999309712777471
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4641509433962264,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.4641509433962264,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5,
"acc_stderr": 0.028444006199428714,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028444006199428714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187899,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187899
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.03561625488673745,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.03561625488673745
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45897435897435895,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.45897435897435895,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.037804458505267334,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.037804458505267334
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6311926605504588,
"acc_stderr": 0.020686227560729555,
"acc_norm": 0.6311926605504588,
"acc_norm_stderr": 0.020686227560729555
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.03038805130167812,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.03038805130167812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.03501038327635897,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.03501038327635897
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.03145068600744859,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.03145068600744859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5153374233128835,
"acc_stderr": 0.03926522378708843,
"acc_norm": 0.5153374233128835,
"acc_norm_stderr": 0.03926522378708843
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.030236389942173085,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.030236389942173085
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6398467432950191,
"acc_stderr": 0.017166362471369306,
"acc_norm": 0.6398467432950191,
"acc_norm_stderr": 0.017166362471369306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49421965317919075,
"acc_stderr": 0.026917296179149116,
"acc_norm": 0.49421965317919075,
"acc_norm_stderr": 0.026917296179149116
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4876543209876543,
"acc_stderr": 0.027812262269327228,
"acc_norm": 0.4876543209876543,
"acc_norm_stderr": 0.027812262269327228
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3624511082138201,
"acc_stderr": 0.01227751253325248,
"acc_norm": 0.3624511082138201,
"acc_norm_stderr": 0.01227751253325248
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.020087362076702857,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.020087362076702857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4857142857142857,
"acc_stderr": 0.03199615232806287,
"acc_norm": 0.4857142857142857,
"acc_norm_stderr": 0.03199615232806287
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652068,
"mc2": 0.3875084099562216,
"mc2_stderr": 0.013510147651392562
}
}
```
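As a quick sanity check on the figures above, the `"all"` accuracy is approximately the unweighted mean of the per-task `acc` values. A minimal sketch, assuming the dict shown above is bound to a variable named `results`:
```python
def mean_accuracy(results: dict) -> float:
    """Unweighted mean of per-task 'acc' values ('all' and truthfulqa excluded)."""
    accs = [scores["acc"] for task, scores in results.items()
            if task != "all" and "acc" in scores]
    return sum(accs) / len(accs)

# with the dict above bound to `results`, mean_accuracy(results)
# should land close to results["all"]["acc"] (~0.471)
```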
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_bongchoi__test-llama-2-7b
|
[
"region:us"
] |
2023-09-12T22:02:47+00:00
|
{"pretty_name": "Evaluation run of bongchoi/test-llama-2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [bongchoi/test-llama-2-7b](https://huggingface.co/bongchoi/test-llama-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bongchoi__test-llama-2-7b\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-12T23:02:34.518107](https://huggingface.co/datasets/open-llm-leaderboard/details_bongchoi__test-llama-2-7b/blob/main/results_2023-09-12T23-02-34.518107.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.471008753299703,\n \"acc_stderr\": 0.03528088196519964,\n \"acc_norm\": 0.4749886536723232,\n \"acc_norm_stderr\": 0.035266604173246285,\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.3875084099562216,\n \"mc2_stderr\": 0.013510147651392562\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4931740614334471,\n \"acc_stderr\": 0.014610029151379813,\n \"acc_norm\": 0.5307167235494881,\n \"acc_norm_stderr\": 0.014583792546304037\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5884285998805019,\n \"acc_stderr\": 0.0049111251010646425,\n \"acc_norm\": 0.785700059749054,\n \"acc_norm_stderr\": 0.004094971980892084\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4641509433962264,\n \"acc_stderr\": 0.030693675018458003,\n \"acc_norm\": 0.4641509433962264,\n \"acc_norm_stderr\": 0.030693675018458003\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n 
\"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.028444006199428714,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.028444006199428714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187899,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187899\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4898989898989899,\n \"acc_stderr\": 0.03561625488673745,\n \"acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.03561625488673745\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.45897435897435895,\n \"acc_stderr\": 
0.025265525491284295,\n \"acc_norm\": 0.45897435897435895,\n \"acc_norm_stderr\": 0.025265525491284295\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.037804458505267334,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.037804458505267334\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6311926605504588,\n \"acc_stderr\": 0.020686227560729555,\n \"acc_norm\": 0.6311926605504588,\n \"acc_norm_stderr\": 0.020686227560729555\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5343137254901961,\n \"acc_stderr\": 0.03501038327635897,\n \"acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.03501038327635897\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6286919831223629,\n \"acc_stderr\": 0.03145068600744859,\n \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.03145068600744859\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5153374233128835,\n \"acc_stderr\": 0.03926522378708843,\n \"acc_norm\": 0.5153374233128835,\n \"acc_norm_stderr\": 0.03926522378708843\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458933,\n \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458933\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.030236389942173085,\n \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.030236389942173085\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6398467432950191,\n \"acc_stderr\": 0.017166362471369306,\n \"acc_norm\": 0.6398467432950191,\n \"acc_norm_stderr\": 
0.017166362471369306\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.49421965317919075,\n \"acc_stderr\": 0.026917296179149116,\n \"acc_norm\": 0.49421965317919075,\n \"acc_norm_stderr\": 0.026917296179149116\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.02862930519400354,\n \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.02862930519400354\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327228,\n \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327228\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3624511082138201,\n \"acc_stderr\": 0.01227751253325248,\n \"acc_norm\": 0.3624511082138201,\n \"acc_norm_stderr\": 0.01227751253325248\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455504,\n \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455504\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.020087362076702857,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.020087362076702857\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4857142857142857,\n \"acc_stderr\": 0.03199615232806287,\n \"acc_norm\": 0.4857142857142857,\n \"acc_norm_stderr\": 0.03199615232806287\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.01512742709652068,\n \"mc2\": 0.3875084099562216,\n \"mc2_stderr\": 0.013510147651392562\n }\n}\n```", "repo_url": "https://huggingface.co/bongchoi/test-llama-2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": 
["**/details_harness|arc:challenge|25_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hellaswag|10_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T23-02-34.518107.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T23-02-34.518107.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T23-02-34.518107.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T23-02-34.518107.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T23-02-34.518107.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_12T23_02_34.518107", "path": ["results_2023-09-12T23-02-34.518107.parquet"]}, {"split": "latest", "path": ["results_2023-09-12T23-02-34.518107.parquet"]}]}]}
|
2023-09-12T22:03:48+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of bongchoi/test-llama-2-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model bongchoi/test-llama-2-7b on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
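A minimal sketch of such a load, assuming the details are published under the leaderboard's usual `details_<org>__<model>` naming convention (the repo id below is an assumption; the config names and the "latest" split are taken from this card's metadata):

```python
from datasets import load_dataset

# Load the per-sample details of one evaluation task. Config names such as
# "harness_truthfulqa_mc_0" and the "latest" split are listed in this card's
# metadata; the repo id is an assumption based on the leaderboard's
# "details_<org>__<model>" naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_bongchoi__test-llama-2-7b",
    "harness_truthfulqa_mc_0",
    split="latest",
)
print(data)
```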
## Latest results
These are the latest results from run 2023-09-12T23:02:34.518107 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of bongchoi/test-llama-2-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model bongchoi/test-llama-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-12T23:02:34.518107(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of bongchoi/test-llama-2-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model bongchoi/test-llama-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-12T23:02:34.518107(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bongchoi/test-llama-2-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model bongchoi/test-llama-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-12T23:02:34.518107(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
666c0e9236a6095a7ace1af3f81ab03e47c6e28e
|
# Dataset of suiren_s_mother (Pokémon)
This is the dataset of suiren_s_mother (Pokémon), containing 500 images and their tags.
The core tags of this character are `blue_hair, breasts, mature_female, blue_eyes, long_hair, freckles, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 520.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suiren_s_mother_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 314.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suiren_s_mother_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1185 | 630.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suiren_s_mother_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 466.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suiren_s_mother_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1185 | 854.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/suiren_s_mother_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/suiren_s_mother_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, cloud, day, outdoors, blue_sky, blush, cleavage, looking_at_viewer, navel, ocean, smile, solo, beach, blue_bikini, water, collarbone, covered_nipples, open_mouth, hair_ornament, huge_breasts, sand, sitting, tongue, white_bikini |
| 1 | 10 |  |  |  |  |  | 1girl, blush, hair_ornament, looking_at_viewer, collarbone, navel, solo, smile, cleavage, micro_bikini, closed_mouth, covered_nipples, open_mouth, sweat |
| 2 | 21 |  |  |  |  |  | nipples, 1girl, looking_at_viewer, navel, solo, blush, smile, collarbone, completely_nude, female_pubic_hair, simple_background, armpits, white_background, hair_ornament |
| 3 | 8 |  |  |  |  |  | 1girl, beach, blush, day, erection, female_pubic_hair, futanari, hair_ornament, large_penis, looking_at_viewer, outdoors, palm_tree, sand, short_sleeves, smile, solo, testicles, uncensored, white_shirt, blue_sky, bracelet, cloud, collarbone, ocean, veiny_penis, closed_mouth, excessive_pubic_hair, watermark, artist_name, bottomless, cleavage, lifted_by_self, lips, navel, no_panties, patreon_username, standing, swept_bangs, thighs, blue_skirt, cowboy_shot, huge_breasts, huge_penis, skirt_lift |
| 4 | 13 |  |  |  |  |  | 1girl, short_sleeves, solo, collarbone, smile, blue_skirt, looking_at_viewer, closed_mouth, blush, bracelet, hair_ornament, hand_up, white_shirt |
| 5 | 6 |  |  |  |  |  | 1boy, 1girl, blush, female_pubic_hair, hetero, navel, nipples, sex, spread_legs, vaginal, collarbone, completely_nude, cum_in_pussy, smile, solo_focus, cheating_(relationship), heart, on_back, bar_censor, closed_mouth, jewelry, open_mouth, sweat, veiny_penis |
| 6 | 6 |  |  |  |  |  | 1boy, 1girl, blush, cheating_(relationship), hetero, nipples, sex_from_behind, sweat, doggystyle, open_mouth, solo_focus, completely_nude, all_fours, teeth |
| 7 | 10 |  |  |  |  |  | 1boy, 1girl, blush, hetero, cheating_(relationship), penis, fellatio, clothed_female_nude_male, male_pubic_hair, shirt, solo_focus, collarbone, mosaic_censoring, short_sleeves, covered_nipples, hair_ornament, heart, ponytail, sweat, tongue_out |
| 8 | 5 |  |  |  |  |  | 1girl, anus, blush, looking_at_viewer, looking_back, ponytail, pussy, solo, huge_breasts, smile, hair_ornament, nipples, uncensored, completely_nude, female_pubic_hair, from_behind, grabbing_own_ass, huge_ass |
| 9 | 5 |  |  |  |  |  | 1girl, cleavage, simple_background, solo, alternate_breast_size, curvy, looking_at_viewer, wide_hips, blue_one-piece_swimsuit, blush, gigantic_breasts, huge_breasts, smile, blue_background, collarbone, covered_navel, covered_nipples, grey_background, open_mouth, ponytail, sweat, thick_thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cloud | day | outdoors | blue_sky | blush | cleavage | looking_at_viewer | navel | ocean | smile | solo | beach | blue_bikini | water | collarbone | covered_nipples | open_mouth | hair_ornament | huge_breasts | sand | sitting | tongue | white_bikini | micro_bikini | closed_mouth | sweat | nipples | completely_nude | female_pubic_hair | simple_background | armpits | white_background | erection | futanari | large_penis | palm_tree | short_sleeves | testicles | uncensored | white_shirt | bracelet | veiny_penis | excessive_pubic_hair | watermark | artist_name | bottomless | lifted_by_self | lips | no_panties | patreon_username | standing | swept_bangs | thighs | blue_skirt | cowboy_shot | huge_penis | skirt_lift | hand_up | 1boy | hetero | sex | spread_legs | vaginal | cum_in_pussy | solo_focus | cheating_(relationship) | heart | on_back | bar_censor | jewelry | sex_from_behind | doggystyle | all_fours | teeth | penis | fellatio | clothed_female_nude_male | male_pubic_hair | shirt | mosaic_censoring | ponytail | tongue_out | anus | looking_back | pussy | from_behind | grabbing_own_ass | huge_ass | alternate_breast_size | curvy | wide_hips | blue_one-piece_swimsuit | gigantic_breasts | blue_background | covered_navel | grey_background | thick_thighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:------|:-----------|:-----------|:--------|:-----------|:--------------------|:--------|:--------|:--------|:-------|:--------|:--------------|:--------|:-------------|:------------------|:-------------|:----------------|:---------------|:-------|:----------|:---------|:---------------|:---------------|:---------------|:--------|:----------|:------------------|:--------------------|:--------------------|:----------|:-------------------|:-----------|:-----------|:--------------|:------------|:----------------|:------------|:-------------|:--------------|:-----------|:--------------|:-----------------------|:------------|:--------------|:-------------|:-----------------|:-------|:-------------|:-------------------|:-----------|:--------------|:---------|:-------------|:--------------|:-------------|:-------------|:----------|:-------|:---------|:------|:--------------|:----------|:---------------|:-------------|:--------------------------|:--------|:----------|:-------------|:----------|:------------------|:-------------|:------------|:--------|:--------|:-----------|:---------------------------|:------------------|:--------|:-------------------|:-----------|:-------------|:-------|:---------------|:--------|:--------------|:-------------------|:-----------|:------------------------|:--------|:------------|:--------------------------|:-------------------|:------------------|:----------------|:------------------|:---------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | | | | | X | X | X | X | | X | X | | | | X | X | X | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 21 |  |  |  |  |  | X | | | | | X | | X | X | | X | X | | | | X | | | X | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | X | | | X | X | X | | | | | X | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 13 |  |  |  |  |  | X | | | | | X | | X | | | X | X | | | | X | | | X | | | | | | | X | | | | | | | | | | | | X | | | X | X | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | | | X | | | X | | X | | | | | X | | X | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | | | X | | | | | | | | | | | | X | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | X | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 10 |  |  |  |  |  | X | | | | | X | | | | | | | | | | X | X | | X | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | X | | | | | X | | X | | | X | X | | | | | | | X | X | | | | | | | | X | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | | | | X | X | X | | | X | X | | | | X | X | X | | X | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X |
|
CyberHarem/suiren_s_mother_pokemon
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-12T23:01:09+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T23:07:00+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of suiren\_s\_mother (Pokémon)
======================================
This is the dataset of suiren\_s\_mother (Pokémon), containing 500 images and their tags.
The core tags of this character are 'blue\_hair, breasts, mature\_female, blue\_eyes, long\_hair, freckles, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
d940ece4a25c7a46cef8f5b9488ae39da02fabba
|
This dataset is obtained from a [UniProt search](https://www.uniprot.org/uniprotkb?facets=proteins_with%3A9%2Cannotation_score%3A4&fields=accession%2Cprotein_families%2Cft_binding%2Cft_act_site%2Csequence%2Ccc_similarity&query=%28ft_binding%3A*%29+AND+%28family%3A*%29&view=table)
for protein sequences with family and binding site annotations. The dataset includes unreviewed (TrEMBL) protein sequences as well as
reviewed sequences. We refined the dataset by only including sequences with an annotation score of 4. We sorted and split by family, where
random families were selected for the test dataset until approximately 20% of the protein sequences were separated out for test data.
We excluded any sequences with `<`, `>`, or `?` in the binding site annotations. We furthermore included any active sites that were not
listed as binding sites in the labels (seen in the merged "Binding-Active Sites" column). We split any sequence longer than 1000 residues
into non-overlapping sections of 1000 amino acids or less after the train/test split. This results in subsequences of the original protein
sequence that may be too short for consideration; filtering the dataset to exclude such subsequences, or segmenting the longer sequences
more intelligently, may improve performance. Pickle files containing only the train/test sequences and their binary labels are also available
and can be downloaded for training or validation of the train/test metrics.
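As a concrete illustration of the chunking step described above, here is a minimal sketch; the helper name `chunk_sequence` is our own, and the actual preprocessing pipeline may differ in details:
```python
def chunk_sequence(seq: str, max_len: int = 1000) -> list:
    """Split a protein sequence into non-overlapping chunks of at most max_len residues."""
    return [seq[i:i + max_len] for i in range(0, len(seq), max_len)]

# Example: a 2350-residue sequence becomes chunks of 1000, 1000 and 350 residues;
# the trailing 350-residue chunk may be too short to be informative on its own.
print([len(c) for c in chunk_sequence("M" * 2350)])  # [1000, 1000, 350]
```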
|
AmelieSchreiber/binding_sites_random_split_by_family_550K
|
[
"size_categories:100K<n<1M",
"language:en",
"license:mit",
"biology",
"protein sequences",
"binding sites",
"active sites",
"region:us"
] |
2023-09-12T23:04:02+00:00
|
{"language": ["en"], "license": "mit", "size_categories": ["100K<n<1M"], "tags": ["biology", "protein sequences", "binding sites", "active sites"]}
|
2023-09-13T18:39:56+00:00
|
[] |
[
"en"
] |
TAGS
#size_categories-100K<n<1M #language-English #license-mit #biology #protein sequences #binding sites #active sites #region-us
|
This dataset is obtained from a UniProt search
for protein sequences with family and binding site annotations. The dataset includes unreviewed (TrEMBL) protein sequences as well as
reviewed sequences. We refined the dataset by only including sequences with an annotation score of 4. We sorted and split by family, where
random families were selected for the test dataset until approximately 20% of the protein sequences were separated out for test data.
We excluded any sequences with '<', '>', or '?' in the binding site annotations. We furthermore included any active sites that were not
listed as binding sites in the labels (seen in the merged "Binding-Active Sites" column). We split any sequence longer than 1000 residues
into non-overlapping sections of 1000 amino acids or less after the train/test split. This results in subsequences of the original protein
sequence that may be too short for consideration; filtering the dataset to exclude such subsequences, or segmenting the longer sequences
more intelligently, may improve performance. Pickle files containing only the train/test sequences and their binary labels are also available
and can be downloaded for training or validation of the train/test metrics.
|
[] |
[
"TAGS\n#size_categories-100K<n<1M #language-English #license-mit #biology #protein sequences #binding sites #active sites #region-us \n"
] |
[
42
] |
[
"passage: TAGS\n#size_categories-100K<n<1M #language-English #license-mit #biology #protein sequences #binding sites #active sites #region-us \n"
] |
838d2823e1dd1b9f07e6c0d2c798c2464f98b4e5
|
# Dataset of totoki_airi/十時愛梨 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of totoki_airi/十時愛梨 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `brown_hair, brown_eyes, breasts, twintails, large_breasts, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 654.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/totoki_airi_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 375.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/totoki_airi_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1209 | 805.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/totoki_airi_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 580.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/totoki_airi_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1209 | 1.12 GiB | [Download](https://huggingface.co/datasets/CyberHarem/totoki_airi_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/totoki_airi_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
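Beyond iterating over the items as above, a loaded source can also be exported in one step; a minimal sketch, assuming waifuc's `SaveExporter` (its import path and signature may vary between waifuc versions):
```python
from waifuc.export import SaveExporter
from waifuc.source import LocalSource

# re-save the dataset (images plus their metadata) into another directory
source = LocalSource('dataset_dir')
source.export(SaveExporter('exported_dir'))
```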
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, nipples, navel, smile, solo, completely_nude, open_mouth, sweat, female_pubic_hair, hair_ornament, barefoot, breast_hold, heart_censor, pussy, simple_background, white_background |
| 1 | 13 |  |  |  |  |  | 1girl, bikini, cleavage, solo, blush, looking_at_viewer, smile, collarbone, navel, open_mouth, simple_background, sweat, white_background |
| 2 | 22 |  |  |  |  |  | 1girl, blush, heart_necklace, solo, collarbone, looking_at_viewer, cleavage, simple_background, white_background, open_mouth, hair_scrunchie, bare_shoulders, off_shoulder, smile, upper_body, striped_shirt, long_sleeves, sweat |
| 3 | 8 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, smile, cleavage, heart_necklace |
| 4 | 7 |  |  |  |  |  | 1girl, blush, cleavage, looking_at_viewer, solo, fur_trim, open_mouth, rabbit_ears, :d, bare_shoulders, upper_body, heart_necklace, simple_background, crop_top, fake_animal_ears, hair_ornament, navel, sidelocks, white_background, white_gloves |
| 5 | 8 |  |  |  |  |  | 1girl, blush, solo, looking_at_viewer, black_thighhighs, open_mouth, striped, dress, sitting, :d, necklace, pantyshot, short_hair, skirt |
| 6 | 13 |  |  |  |  |  | 1girl, solo, blush, cleavage, dress, looking_at_viewer, bare_shoulders, open_mouth, white_gloves, :d, heart, tiara, bow, hair_ribbon, jewelry, microphone, frills |
| 7 | 5 |  |  |  |  |  | 1girl, cleavage, detached_collar, frills, maid_headdress, open_mouth, solo, :d, apron, hair_ribbon, looking_at_viewer, puffy_short_sleeves, blush, long_hair, red_dress, upper_body, bowtie, drill_hair, fruit, hairclip, heart, jewelry, plaid, striped |
| 8 | 10 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, penis, solo_focus, navel, sex, vaginal, looking_at_viewer, open_mouth, completely_nude, girl_on_top, long_hair, smile, cowgirl_position, mosaic_censoring, pov, pussy, spread_legs, sweat, cum, hair_ornament, scrunchie |
| 9 | 9 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, paizuri, penis, solo_focus, heart, smile, huge_breasts, nude, one_eye_closed, bar_censor, ejaculation, facial, looking_at_viewer, open_mouth, pov, sweat |
| 10 | 7 |  |  |  |  |  | 1girl, blue_skirt, long_hair, sleeveless_shirt, beret, bracelet, smile, solo, white_shirt, blush, looking_at_viewer, open_mouth, belt, black_headwear, frills, plaid_skirt, collarbone, earrings, heart_necklace, neck_ribbon |
| 11 | 7 |  |  |  |  |  | 1girl, fishnet_pantyhose, playboy_bunny, rabbit_ears, strapless_leotard, bare_shoulders, black_bowtie, blush, detached_collar, fake_animal_ears, looking_at_viewer, red_leotard, solo, simple_background, cleavage, frills, jewelry, long_hair, open_mouth, brown_pantyhose, cowboy_shot, nipples, smile, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | nipples | navel | smile | solo | completely_nude | open_mouth | sweat | female_pubic_hair | hair_ornament | barefoot | breast_hold | heart_censor | pussy | simple_background | white_background | bikini | cleavage | collarbone | heart_necklace | hair_scrunchie | bare_shoulders | off_shoulder | upper_body | striped_shirt | long_sleeves | fur_trim | rabbit_ears | :d | crop_top | fake_animal_ears | sidelocks | white_gloves | black_thighhighs | striped | dress | sitting | necklace | pantyshot | short_hair | skirt | heart | tiara | bow | hair_ribbon | jewelry | microphone | frills | detached_collar | maid_headdress | apron | puffy_short_sleeves | long_hair | red_dress | bowtie | drill_hair | fruit | hairclip | plaid | 1boy | hetero | penis | solo_focus | sex | vaginal | girl_on_top | cowgirl_position | mosaic_censoring | pov | spread_legs | cum | scrunchie | paizuri | huge_breasts | nude | one_eye_closed | bar_censor | ejaculation | facial | blue_skirt | sleeveless_shirt | beret | bracelet | white_shirt | belt | black_headwear | plaid_skirt | earrings | neck_ribbon | fishnet_pantyhose | playboy_bunny | strapless_leotard | black_bowtie | red_leotard | brown_pantyhose | cowboy_shot |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:--------------------|:----------|:--------|:--------|:-------|:------------------|:-------------|:--------|:--------------------|:----------------|:-----------|:--------------|:---------------|:--------|:--------------------|:-------------------|:---------|:-----------|:-------------|:-----------------|:-----------------|:-----------------|:---------------|:-------------|:----------------|:---------------|:-----------|:--------------|:-----|:-----------|:-------------------|:------------|:---------------|:-------------------|:----------|:--------|:----------|:-----------|:------------|:-------------|:--------|:--------|:--------|:------|:--------------|:----------|:-------------|:---------|:------------------|:-----------------|:--------|:----------------------|:------------|:------------|:---------|:-------------|:--------|:-----------|:--------|:-------|:---------|:--------|:-------------|:------|:----------|:--------------|:-------------------|:-------------------|:------|:--------------|:------|:------------|:----------|:---------------|:-------|:-----------------|:-------------|:--------------|:---------|:-------------|:-------------------|:--------|:-----------|:--------------|:-------|:-----------------|:--------------|:-----------|:--------------|:--------------------|:----------------|:--------------------|:---------------|:--------------|:------------------|:--------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | | X | X | X | | X | X | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 22 |  |  |  |  |  | X | X | X | | | X | X | | X | X | | | | | | | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | X | | | X | X | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | X | | X | | X | | X | | | X | | | | | X | X | | X | | X | | X | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 13 |  |  |  |  |  | X | X | X | | | | X | | X | | | | | | | | | | | X | | | | X | | | | | | | X | | | | X | | | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | X | | | | X | | X | | | | | | | | | | | X | | | | | | X | | | | | X | | | | | | X | | | | | | | X | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 10 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 9 |  |  |  |  |  | X | X | X | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 10 | 7 |  |  |  |  |  | X | X | X | | | X | X | | X | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 11 | 7 |  |  |  |  |  | X | X | X | X | | X | X | | X | | | | | | | | X | X | | X | | | | X | | | | | | X | | | X | | | | | | | | | | | | | | | X | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
CyberHarem/totoki_airi_idolmastercinderellagirls
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-12T23:20:00+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T16:39:31+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of totoki\_airi/十時愛梨 (THE iDOLM@STER: Cinderella Girls)
===============================================================
This is the dataset of totoki\_airi/十時愛梨 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are 'brown\_hair, brown\_eyes, breasts, twintails, large\_breasts, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
40d58e6f998d981bfa17ed5226192b1be3247292
|
# Dataset of team_skull_underling (Pokémon)
This is the dataset of team_skull_underling (Pokémon), containing 93 images and their tags.
The core tags of this character are `pink_hair, pink_eyes, breasts, hat, dark_skin, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 93 | 64.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/team_skull_underling_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 93 | 46.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/team_skull_underling_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 198 | 84.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/team_skull_underling_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 93 | 61.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/team_skull_underling_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 198 | 102.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/team_skull_underling_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/team_skull_underling_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, bandana, nipples, solo_focus, navel, wristband, hetero, looking_at_viewer, necklace, pussy, 1boy, shirt_lift, tank_top, blush, bottomless, covered_mouth, large_breasts, no_bra, sweat, anus, half-closed_eyes, shiny_skin, sleeveless, spread_legs |
| 1 | 26 |  |  |  |  |  | 1girl, solo, bandana, short_shorts, necklace, tank_top, white_shorts, wristband, looking_at_viewer, makeup, medium_breasts, poke_ball, simple_background, thigh_strap, dark-skinned_female, shoes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bandana | nipples | solo_focus | navel | wristband | hetero | looking_at_viewer | necklace | pussy | 1boy | shirt_lift | tank_top | blush | bottomless | covered_mouth | large_breasts | no_bra | sweat | anus | half-closed_eyes | shiny_skin | sleeveless | spread_legs | solo | short_shorts | white_shorts | makeup | medium_breasts | poke_ball | simple_background | thigh_strap | dark-skinned_female | shoes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:----------|:-------------|:--------|:------------|:---------|:--------------------|:-----------|:--------|:-------|:-------------|:-----------|:--------|:-------------|:----------------|:----------------|:---------|:--------|:-------|:-------------------|:-------------|:-------------|:--------------|:-------|:---------------|:---------------|:---------|:-----------------|:------------|:--------------------|:--------------|:----------------------|:--------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 1 | 26 |  |  |  |  |  | X | X | | | | X | | X | X | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/team_skull_underling_pokemon
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-12T23:24:52+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T21:50:55+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of team\_skull\_underling (Pokémon)
===========================================
This is the dataset of team\_skull\_underling (Pokémon), containing 93 images and their tags.
The core tags of this character are 'pink\_hair, pink\_eyes, breasts, hat, dark\_skin, long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
1672ff44b348dc948f58fe669a6fdb4e293215e8
|
# Dataset Card for Evaluation run of TheBloke/manticore-13b-chat-pyg-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/manticore-13b-chat-pyg-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/manticore-13b-chat-pyg-GPTQ](https://huggingface.co/TheBloke/manticore-13b-chat-pyg-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
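# "harness_winogrande_5" is one of the 3 task configurations; splits are named by
# run timestamp, and the "train" split always points to the latest run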
data = load_dataset("open-llm-leaderboard/details_TheBloke__manticore-13b-chat-pyg-GPTQ_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-07T17:20:44.747146](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__manticore-13b-chat-pyg-GPTQ_public/blob/main/results_2023-11-07T17-20-44.747146.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.006921140939597316,
"em_stderr": 0.0008490247804930618,
"f1": 0.06798238255033592,
"f1_stderr": 0.0015724347441108313,
"acc": 0.4220933440164484,
"acc_stderr": 0.00984688601833749
},
"harness|drop|3": {
"em": 0.006921140939597316,
"em_stderr": 0.0008490247804930618,
"f1": 0.06798238255033592,
"f1_stderr": 0.0015724347441108313
},
"harness|gsm8k|5": {
"acc": 0.08491281273692192,
"acc_stderr": 0.007678212824450799
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224183
}
}
```
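To fetch these aggregated numbers programmatically, a minimal sketch using the `results` configuration and `latest` split declared in this card's metadata:
```python
from datasets import load_dataset

# the "results" configuration stores the aggregated metrics of each run;
# the "latest" split always points to the most recent evaluation
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__manticore-13b-chat-pyg-GPTQ_public",
    "results",
    split="latest",
)
```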
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__manticore-13b-chat-pyg-GPTQ
|
[
"region:us"
] |
2023-09-12T23:35:19+00:00
|
{"pretty_name": "Evaluation run of TheBloke/manticore-13b-chat-pyg-GPTQ", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/manticore-13b-chat-pyg-GPTQ](https://huggingface.co/TheBloke/manticore-13b-chat-pyg-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__manticore-13b-chat-pyg-GPTQ_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-07T17:20:44.747146](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__manticore-13b-chat-pyg-GPTQ_public/blob/main/results_2023-11-07T17-20-44.747146.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006921140939597316,\n \"em_stderr\": 0.0008490247804930618,\n \"f1\": 0.06798238255033592,\n \"f1_stderr\": 0.0015724347441108313,\n \"acc\": 0.4220933440164484,\n \"acc_stderr\": 0.00984688601833749\n },\n \"harness|drop|3\": {\n \"em\": 0.006921140939597316,\n \"em_stderr\": 0.0008490247804930618,\n \"f1\": 0.06798238255033592,\n \"f1_stderr\": 0.0015724347441108313\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08491281273692192,\n \"acc_stderr\": 0.007678212824450799\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224183\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/manticore-13b-chat-pyg-GPTQ", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_07T17_20_44.747146", "path": ["**/details_harness|drop|3_2023-11-07T17-20-44.747146.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-07T17-20-44.747146.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_07T17_20_44.747146", "path": ["**/details_harness|gsm8k|5_2023-11-07T17-20-44.747146.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-07T17-20-44.747146.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_07T17_20_44.747146", "path": ["**/details_harness|winogrande|5_2023-11-07T17-20-44.747146.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-07T17-20-44.747146.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_07T17_20_44.747146", "path": ["results_2023-11-07T17-20-44.747146.parquet"]}, {"split": "latest", "path": ["results_2023-11-07T17-20-44.747146.parquet"]}]}]}
|
2023-12-01T14:42:21+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/manticore-13b-chat-pyg-GPTQ
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/manticore-13b-chat-pyg-GPTQ on the Open LLM Leaderboard.
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-11-07T17:20:44.747146 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/manticore-13b-chat-pyg-GPTQ",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/manticore-13b-chat-pyg-GPTQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-07T17:20:44.747146(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/manticore-13b-chat-pyg-GPTQ",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/manticore-13b-chat-pyg-GPTQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-07T17:20:44.747146(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
27,
31,
176,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/manticore-13b-chat-pyg-GPTQ## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/manticore-13b-chat-pyg-GPTQ on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-07T17:20:44.747146(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
c0a65af09bc066185040934d6535ae173b6dbb89
|
## Publication Abstract
Colorectal cancer, a commonly diagnosed cancer in the elderly, often develops slowly from benign polyps called adenoma. The gut microbiota is believed to be directly involved in colorectal carcinogenesis. The identity and functional capacity of the adenoma- or carcinoma-related gut microbe(s), however, have not been surveyed in a comprehensive manner. Here we perform a metagenome-wide association study (MGWAS) on stools from advanced adenoma and carcinoma patients and from healthy subjects, revealing microbial genes, strains and functions enriched in each group. An analysis of potential risk factors indicates that high intake of red meat relative to fruits and vegetables appears to associate with outgrowth of bacteria that might contribute to a more hostile gut environment. These findings suggest that faecal microbiome-based strategies may be useful for early diagnosis and treatment of colorectal adenoma or carcinoma.
## Dataset
156 metagenomic shotgun-sequenced faecal samples from colorectal adenoma and carcinoma patients and healthy controls
### Configurations
- `presence-absence`
- `CLR`
## Usage
```python
import numpy as np
from datasets import load_dataset

# choose the "presence-absence" or "CLR" configuration
dataset = load_dataset("wwydmanski/colorectal-carcinoma-microbiome-fengq", "presence-absence")
train_dataset, test_dataset = dataset['train'], dataset['test']
X_train = np.array(train_dataset['values'])
y_train = np.array(train_dataset['target'])
X_test = np.array(test_dataset['values'])
y_test = np.array(test_dataset['target'])
```
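The arrays produced above can be fed to any standard classifier; for instance, a minimal scikit-learn baseline (logistic regression is our own illustrative choice, not part of this card):
```python
from sklearn.linear_model import LogisticRegression

# fit a simple baseline on the presence-absence features
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```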
|
kamaludeen/fututech-colorectal-cancer
|
[
"task_categories:tabular-classification",
"size_categories:n<1K",
"microbiome",
"tabular",
"gut-microbiota",
"region:us"
] |
2023-09-12T23:36:16+00:00
|
{"size_categories": ["n<1K"], "task_categories": ["tabular-classification"], "pretty_name": "Colorectal Carcinoma Feng Q 2015", "tags": ["microbiome", "tabular", "gut-microbiota"]}
|
2023-09-13T00:17:03+00:00
|
[] |
[] |
TAGS
#task_categories-tabular-classification #size_categories-n<1K #microbiome #tabular #gut-microbiota #region-us
|
## Publication Abstract
Colorectal cancer, a commonly diagnosed cancer in the elderly, often develops slowly from benign polyps called adenoma. The gut microbiota is believed to be directly involved in colorectal carcinogenesis. The identity and functional capacity of the adenoma- or carcinoma-related gut microbe(s), however, have not been surveyed in a comprehensive manner. Here we perform a metagenome-wide association study (MGWAS) on stools from advanced adenoma and carcinoma patients and from healthy subjects, revealing microbial genes, strains and functions enriched in each group. An analysis of potential risk factors indicates that high intake of red meat relative to fruits and vegetables appears to associate with outgrowth of bacteria that might contribute to a more hostile gut environment. These findings suggest that faecal microbiome-based strategies may be useful for early diagnosis and treatment of colorectal adenoma or carcinoma.
## Dataset
156 metagenomic shotgun-sequenced faecal samples from colorectal adenoma and carcinoma patients and healthy controls
### Configurations
- 'presence-absence'
- 'CLR'
## Usage
|
[
"## Publication Abstract\n\nColorectal cancer, a commonly diagnosed cancer in the elderly, often develops slowly from benign polyps called adenoma. The gut microbiota is believed to be directly involved in colorectal carcinogenesis. The identity and functional capacity of the adenoma- or carcinoma-related gut microbe(s), however, have not been surveyed in a comprehensive manner. Here we perform a metagenome-wide association study (MGWAS) on stools from advanced adenoma and carcinoma patients and from healthy subjects, revealing microbial genes, strains and functions enriched in each group. An analysis of potential risk factors indicates that high intake of red meat relative to fruits and vegetables appears to associate with outgrowth of bacteria that might contribute to a more hostile gut environment. These findings suggest that faecal microbiome-based strategies may be useful for early diagnosis and treatment of colorectal adenoma or carcinoma.",
"## Dataset\n156 metagenomic shotgun-sequenced faecal samples from colorectal adenoma and carcinoma patients and healthy controls",
"### Configurations\n - 'presence-absence'\n - 'CLR'",
"## Usage"
] |
[
"TAGS\n#task_categories-tabular-classification #size_categories-n<1K #microbiome #tabular #gut-microbiota #region-us \n",
"## Publication Abstract\n\nColorectal cancer, a commonly diagnosed cancer in the elderly, often develops slowly from benign polyps called adenoma. The gut microbiota is believed to be directly involved in colorectal carcinogenesis. The identity and functional capacity of the adenoma- or carcinoma-related gut microbe(s), however, have not been surveyed in a comprehensive manner. Here we perform a metagenome-wide association study (MGWAS) on stools from advanced adenoma and carcinoma patients and from healthy subjects, revealing microbial genes, strains and functions enriched in each group. An analysis of potential risk factors indicates that high intake of red meat relative to fruits and vegetables appears to associate with outgrowth of bacteria that might contribute to a more hostile gut environment. These findings suggest that faecal microbiome-based strategies may be useful for early diagnosis and treatment of colorectal adenoma or carcinoma.",
"## Dataset\n156 metagenomic shotgun-sequenced faecal samples from colorectal adenoma and carcinoma patients and healthy controls",
"### Configurations\n - 'presence-absence'\n - 'CLR'",
"## Usage"
] |
[
41,
221,
33,
17,
3
] |
[
"passage: TAGS\n#task_categories-tabular-classification #size_categories-n<1K #microbiome #tabular #gut-microbiota #region-us \n## Publication Abstract\n\nColorectal cancer, a commonly diagnosed cancer in the elderly, often develops slowly from benign polyps called adenoma. The gut microbiota is believed to be directly involved in colorectal carcinogenesis. The identity and functional capacity of the adenoma- or carcinoma-related gut microbe(s), however, have not been surveyed in a comprehensive manner. Here we perform a metagenome-wide association study (MGWAS) on stools from advanced adenoma and carcinoma patients and from healthy subjects, revealing microbial genes, strains and functions enriched in each group. An analysis of potential risk factors indicates that high intake of red meat relative to fruits and vegetables appears to associate with outgrowth of bacteria that might contribute to a more hostile gut environment. These findings suggest that faecal microbiome-based strategies may be useful for early diagnosis and treatment of colorectal adenoma or carcinoma.## Dataset\n156 metagenomic shotgun-sequenced faecal samples from colorectal adenoma and carcinoma patients and healthy controls### Configurations\n - 'presence-absence'\n - 'CLR'## Usage"
] |
92efe3b3c3f527399cb3ef09836cc8aade0efebb
|
# Dataset of mai (Pokémon)
This is the dataset of mai (Pokémon), containing 114 images and their tags.
The core tags of this character are `black_hair, short_hair, breasts, blue_eyes, hair_ornament, mole, mole_under_mouth, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 114 | 111.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mai_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 114 | 66.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mai_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 247 | 129.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mai_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 114 | 101.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mai_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 247 | 182.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mai_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mai_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, hair_bow, smile, closed_mouth, pantyhose, pokemon_(creature), white_bow, blush, gothic_lolita, solo, black_dress, detached_sleeves, eyelashes, looking_at_viewer, simple_background |
| 1 | 15 |  |  |  |  |  | looking_at_viewer, 1girl, smile, solo, blush, closed_mouth, hood, jacket |
| 2 | 19 |  |  |  |  |  | 1girl, blush, hetero, pussy, sex, open_mouth, penis, 1boy, vaginal, nipples, spread_legs, tongue, cum, mosaic_censoring, nude, pantyhose, torn_clothes, uncensored |
| 3 | 6 |  |  |  |  |  | 1boy, 1girl, hetero, penis, blush, fellatio, solo_focus, cum_in_mouth, jacket, censored, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hair_bow | smile | closed_mouth | pantyhose | pokemon_(creature) | white_bow | blush | gothic_lolita | solo | black_dress | detached_sleeves | eyelashes | looking_at_viewer | simple_background | hood | jacket | hetero | pussy | sex | open_mouth | penis | 1boy | vaginal | nipples | spread_legs | tongue | cum | mosaic_censoring | nude | torn_clothes | uncensored | fellatio | solo_focus | cum_in_mouth | censored |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------|:---------------|:------------|:---------------------|:------------|:--------|:----------------|:-------|:--------------|:-------------------|:------------|:--------------------|:--------------------|:-------|:---------|:---------|:--------|:------|:-------------|:--------|:-------|:----------|:----------|:--------------|:---------|:------|:-------------------|:-------|:---------------|:-------------|:-----------|:-------------|:---------------|:-----------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | | X | X | | | | X | | X | | | | X | | X | X | | | | | | | | | | | | | | | | | | | |
| 2 | 19 |  |  |  |  |  | X | | | | X | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 3 | 6 |  |  |  |  |  | X | | | | | | | X | | | | | | X | | | X | X | | | | X | X | | | | | | | | | | X | X | X | X |
|
CyberHarem/mai_pokemon
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-12T23:58:13+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T22:19:14+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of mai (Pokémon)
========================
This is the dataset of mai (Pokémon), containing 114 images and their tags.
The core tags of this character are 'black\_hair, short\_hair, breasts, blue\_eyes, hair\_ornament, mole, mole\_under\_mouth, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
d53733cf80e5059f189f38c1b542d581377629a0
|
# Dataset Card for "higgs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
jxie/higgs
|
[
"region:us"
] |
2023-09-13T00:10:20+00:00
|
{"dataset_info": {"features": [{"name": "inputs", "sequence": "float64"}, {"name": "label", "dtype": "float64"}], "splits": [{"name": "val_16k", "num_bytes": 3702368, "num_examples": 15688}, {"name": "train_10k", "num_bytes": 2360000, "num_examples": 10000}, {"name": "train_1k", "num_bytes": 236000, "num_examples": 1000}, {"name": "train_68k", "num_bytes": 14809236, "num_examples": 62751}, {"name": "train_100k", "num_bytes": 23600000, "num_examples": 100000}, {"name": "train", "num_bytes": 2478000000, "num_examples": 10500000}, {"name": "test", "num_bytes": 118000000, "num_examples": 500000}, {"name": "test_20k", "num_bytes": 4627960, "num_examples": 19610}, {"name": "train_63k", "num_bytes": 14809236, "num_examples": 62751}], "download_size": 2168393527, "dataset_size": 2660144800}}
|
2023-09-20T05:01:24+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "higgs"
More Information needed
|
[
"# Dataset Card for \"higgs\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"higgs\"\n\nMore Information needed"
] |
[
6,
13
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"higgs\"\n\nMore Information needed"
] |
5bbbde5de78afc13af651f2a3f209ac0f785d3e6
|
# Dataset of ruri/ルリ (Pokémon)
This is the dataset of ruri/ルリ (Pokémon), containing 34 images and their tags.
The core tags of this character are `pink_hair, hat, blue_eyes, breasts, short_hair, bow, mole`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 34 | 21.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ruri_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 34 | 15.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ruri_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 70 | 31.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ruri_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 34 | 20.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ruri_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 70 | 39.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ruri_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ruri_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, smile, bag, open_mouth, skirt, hat_bow, long_sleeves, shirt, white_headwear |
| 1 | 8 |  |  |  |  |  | 1boy, 1girl, hetero, nude, penis, solo_focus, blush, cum, nipples, smile, large_breasts, open_mouth, long_hair, pov, pussy, sex |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | looking_at_viewer | smile | bag | open_mouth | skirt | hat_bow | long_sleeves | shirt | white_headwear | 1boy | hetero | nude | penis | solo_focus | cum | nipples | large_breasts | long_hair | pov | pussy | sex |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:--------|:------|:-------------|:--------|:----------|:---------------|:--------|:-----------------|:-------|:---------|:-------|:--------|:-------------|:------|:----------|:----------------|:------------|:------|:--------|:------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | | X | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/ruri_pokemon
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-13T00:10:52+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T22:13:20+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of ruri/ルリ (Pokémon)
============================
This is the dataset of ruri/ルリ (Pokémon), containing 34 images and their tags.
The core tags of this character are 'pink\_hair, hat, blue\_eyes, breasts, short\_hair, bow, mole', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
9c60ee49145d37546986df68464cd8ee0b542baa
|
# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-base
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TigerResearch/tigerbot-70b-base
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TigerResearch/tigerbot-70b-base](https://huggingface.co/TigerResearch/tigerbot-70b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TigerResearch__tigerbot-70b-base",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T09:25:20.725516](https://huggingface.co/datasets/open-llm-leaderboard/details_TigerResearch__tigerbot-70b-base/blob/main/results_2023-10-24T09-25-20.725516.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.4872063758389262,
"em_stderr": 0.005118791512925044,
"f1": 0.5244914010067125,
"f1_stderr": 0.004935563924712029,
"acc": 0.5897264974960701,
"acc_stderr": 0.012277506705422794
},
"harness|drop|3": {
"em": 0.4872063758389262,
"em_stderr": 0.005118791512925044,
"f1": 0.5244914010067125,
"f1_stderr": 0.004935563924712029
},
"harness|gsm8k|5": {
"acc": 0.3775587566338135,
"acc_stderr": 0.013353150666358539
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.011201862744487047
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TigerResearch__tigerbot-70b-base
|
[
"region:us"
] |
2023-09-13T00:25:28+00:00
|
{"pretty_name": "Evaluation run of TigerResearch/tigerbot-70b-base", "dataset_summary": "Dataset automatically created during the evaluation run of model [TigerResearch/tigerbot-70b-base](https://huggingface.co/TigerResearch/tigerbot-70b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TigerResearch__tigerbot-70b-base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-24T09:25:20.725516](https://huggingface.co/datasets/open-llm-leaderboard/details_TigerResearch__tigerbot-70b-base/blob/main/results_2023-10-24T09-25-20.725516.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4872063758389262,\n \"em_stderr\": 0.005118791512925044,\n \"f1\": 0.5244914010067125,\n \"f1_stderr\": 0.004935563924712029,\n \"acc\": 0.5897264974960701,\n \"acc_stderr\": 0.012277506705422794\n },\n \"harness|drop|3\": {\n \"em\": 0.4872063758389262,\n \"em_stderr\": 0.005118791512925044,\n \"f1\": 0.5244914010067125,\n \"f1_stderr\": 0.004935563924712029\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3775587566338135,\n \"acc_stderr\": 0.013353150666358539\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.011201862744487047\n }\n}\n```", "repo_url": "https://huggingface.co/TigerResearch/tigerbot-70b-base", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|arc:challenge|25_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_24T09_25_20.725516", "path": ["**/details_harness|drop|3_2023-10-24T09-25-20.725516.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-24T09-25-20.725516.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_24T09_25_20.725516", "path": ["**/details_harness|gsm8k|5_2023-10-24T09-25-20.725516.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-24T09-25-20.725516.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hellaswag|10_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T01-25-14.196261.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T01-25-14.196261.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T01-25-14.196261.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T01-25-14.196261.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T01-25-14.196261.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_24T09_25_20.725516", "path": ["**/details_harness|winogrande|5_2023-10-24T09-25-20.725516.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-24T09-25-20.725516.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_13T01_25_14.196261", "path": ["results_2023-09-13T01-25-14.196261.parquet"]}, {"split": "2023_10_24T09_25_20.725516", "path": ["results_2023-10-24T09-25-20.725516.parquet"]}, {"split": "latest", "path": ["results_2023-10-24T09-25-20.725516.parquet"]}]}]}
|
2023-10-24T08:25:34+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-base
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TigerResearch/tigerbot-70b-base on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-24T09:25:20.725516 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-base",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TigerResearch/tigerbot-70b-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T09:25:20.725516(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-base",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TigerResearch/tigerbot-70b-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T09:25:20.725516(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-base## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TigerResearch/tigerbot-70b-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-24T09:25:20.725516(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
d2ced757a99e2bfce2a18629a62e84ab09c94af9
|
This dataset is based on the [SAMSum](https://huggingface.co/datasets/samsum) dataset.
The summaries are generated by prompting the OpenAI ChatGPT API (gpt-3.5-turbo) with a temperature of 0.7.
The fine-tuned models outperform the baselines on multiple metrics, demonstrating ChatGPT’s few-shot learning and summarization ability, and thus the potential to save human labor in summarization annotation.
Fine-tuned models are also uploaded to Hugging Face.
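For reference, the following is a minimal sketch of the kind of call described above, using the pre-1.0 `openai` Python client; the prompt wording is an assumption, since the card does not state the actual prompt or any few-shot examples.
```python
import openai  # assumes the pre-1.0 openai client current in 2023

openai.api_key = "sk-..."  # placeholder key

def summarize(dialogue: str) -> str:
    # gpt-3.5-turbo at temperature 0.7, matching the setup described above;
    # the prompt text is illustrative, not the authors' actual prompt.
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        temperature=0.7,
        messages=[{"role": "user",
                   "content": f"Summarize the following dialogue:\n\n{dialogue}"}],
    )
    return response["choices"][0]["message"]["content"]
```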
|
BYC-Sophie/samsum-chatgpt-summary
|
[
"region:us"
] |
2023-09-13T00:35:16+00:00
|
{}
|
2023-09-13T03:12:18+00:00
|
[] |
[] |
TAGS
#region-us
|
This dataset is based on the SAMSum dataset.
The summaries are generated by prompting the OpenAI ChatGPT API (gpt-3.5-turbo) with a temperature of 0.7.
The fine-tuned models outperform the baselines on multiple metrics, demonstrating ChatGPT’s few-shot learning and summarization ability, and thus the potential to save human labor in summarization annotation.
Fine-tuned models are also uploaded to Hugging Face.
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
cd1cd88ec1aaf744f89cf3089c23a96128ee6816
|
# Dataset of melissa (Pokémon)
This is the dataset of melissa (Pokémon), containing 125 images and their tags.
The core tags of this character are `purple_hair, breasts, purple_eyes, long_hair, large_breasts, quad_tails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 125 | 91.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/melissa_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 125 | 60.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/melissa_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 235 | 104.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/melissa_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 125 | 84.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/melissa_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 235 | 134.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/melissa_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/melissa_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------|
| 0 | 54 |  |  |  |  |  | 1girl, solo, elbow_gloves, bare_shoulders, white_gloves, cleavage, smile, purple_dress, blush, lipstick |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | elbow_gloves | bare_shoulders | white_gloves | cleavage | smile | purple_dress | blush | lipstick |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:-----------------|:---------------|:-----------|:--------|:---------------|:--------|:-----------|
| 0 | 54 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
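The cluster tags above can be matched directly against each item's metadata. Here is a minimal filtering sketch reusing the `LocalSource` from the loading snippet; it assumes `item.meta['tags']` behaves as a mapping or iterable of tag names.
```python
from waifuc.source import LocalSource

# Keep only images carrying some distinctive tags of cluster 0 above.
wanted = {'elbow_gloves', 'purple_dress', 'white_gloves'}
source = LocalSource('dataset_dir')
for item in source:
    tags = set(item.meta['tags'])  # works for a list or a tag->score dict
    if wanted <= tags:
        print(item.meta['filename'])
```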
|
CyberHarem/melissa_pokemon
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-13T00:43:36+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T22:30:15+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of melissa (Pokémon)
============================
This is the dataset of melissa (Pokémon), containing 125 images and their tags.
The core tags of this character are 'purple\_hair, breasts, purple\_eyes, long\_hair, large\_breasts, quad\_tails', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
005be4b84ae77a830f8dddffa51fe46c4181a072
|
# Dataset Card for "SECOND_KQ_V2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
jjonhwa/SECOND_KQ_V2
|
[
"region:us"
] |
2023-09-13T00:44:49+00:00
|
{"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answers", "sequence": "string"}, {"name": "ctxs", "list": [{"name": "score", "dtype": "float64"}, {"name": "text", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 686780736, "num_examples": 86975}], "download_size": 276955064, "dataset_size": 686780736}}
|
2023-09-13T06:04:47+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "SECOND_KQ_V2"
More Information needed
|
[
"# Dataset Card for \"SECOND_KQ_V2\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"SECOND_KQ_V2\"\n\nMore Information needed"
] |
[
6,
19
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"SECOND_KQ_V2\"\n\nMore Information needed"
] |
59b24f809007769f9dff2dd7044fc87bf962ebd4
|
# Dataset Card for "MATH_and_PRM"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
approach0/MATH_and_PRM
|
[
"region:us"
] |
2023-09-13T00:47:03+00:00
|
{"dataset_info": {"features": [{"name": "src_path", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 15325348.0, "num_examples": 13665}, {"name": "test", "num_bytes": 8685910.0, "num_examples": 8076}], "download_size": 9782004, "dataset_size": 24011258.0}}
|
2023-09-13T00:47:13+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "MATH_and_PRM"
More Information needed
|
[
"# Dataset Card for \"MATH_and_PRM\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"MATH_and_PRM\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"MATH_and_PRM\"\n\nMore Information needed"
] |
d04e6297705fd5aa7c811be2cd20bbf86c7c57e1
|
# Dataset Card for "MATH"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
approach0/MATH-no-asy
|
[
"region:us"
] |
2023-09-13T00:47:47+00:00
|
{"dataset_info": {"features": [{"name": "src_path", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5157479.0, "num_examples": 6217}, {"name": "test", "num_bytes": 3381766.0, "num_examples": 4212}], "download_size": 3505684, "dataset_size": 8539245.0}}
|
2023-09-13T00:47:49+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "MATH"
More Information needed
|
[
"# Dataset Card for \"MATH\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"MATH\"\n\nMore Information needed"
] |
[
6,
12
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"MATH\"\n\nMore Information needed"
] |
eb77dd7e8aaab10fae2ad4cd5a7233a4f5d4e1a1
|
# Dataset Card for "PRM"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
approach0/PRM
|
[
"region:us"
] |
2023-09-13T00:48:08+00:00
|
{"dataset_info": {"features": [{"name": "src_path", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 10167869.0, "num_examples": 7448}, {"name": "test", "num_bytes": 5304144.0, "num_examples": 3864}], "download_size": 5681426, "dataset_size": 15472013.0}}
|
2023-09-13T00:48:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "PRM"
More Information needed
|
[
"# Dataset Card for \"PRM\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"PRM\"\n\nMore Information needed"
] |
[
6,
12
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"PRM\"\n\nMore Information needed"
] |
41a9122b93025b511cbd0cad2d32ec17237c852a
|
# Dataset of ran/ラン (Pokémon)
This is the dataset of ran/ラン (Pokémon), containing 71 images and their tags.
The core tags of this character are `black_hair, hair_bun, single_hair_bun, blue_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 71 | 38.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ran_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 71 | 29.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ran_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 92 | 45.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ran_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 71 | 36.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ran_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 92 | 55.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ran_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ran_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, open_mouth, looking_at_viewer, :d, hair_ribbon, pants, solo, blue_hair, full_body, long_sleeves, shoes, sidelocks, star_(symbol) |
| 1 | 9 |  |  |  |  |  | long_sleeves, open_mouth, 1girl, :d, blue_jacket, sidelocks, star_(symbol), tongue, 1boy, black_eyes, blue_pants, grey_eyes, hair_ribbon, pokemon_(creature), short_hair, solo |
| 2 | 6 |  |  |  |  |  | closed_mouth, outdoors, pants, short_hair, smile, 1boy, black_eyes, day, male_focus, pokemon_(creature), 1girl, looking_at_viewer, sitting, sky, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | open_mouth | looking_at_viewer | :d | hair_ribbon | pants | solo | blue_hair | full_body | long_sleeves | shoes | sidelocks | star_(symbol) | blue_jacket | tongue | 1boy | black_eyes | blue_pants | grey_eyes | pokemon_(creature) | short_hair | closed_mouth | outdoors | smile | day | male_focus | sitting | sky | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------------------|:-----|:--------------|:--------|:-------|:------------|:------------|:---------------|:--------|:------------|:----------------|:--------------|:---------|:-------|:-------------|:-------------|:------------|:---------------------|:-------------|:---------------|:-----------|:--------|:------|:-------------|:----------|:------|:-----------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | | X | X | | X | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | | | X | | | | | | | | | | X | X | | | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/ran_pokemon
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-13T00:49:44+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T22:18:15+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of ran/ラン (Pokémon)
===========================
This is the dataset of ran/ラン (Pokémon), containing 71 images and their tags.
The core tags of this character are 'black\_hair, hair\_bun, single\_hair\_bun, blue\_eyes, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
4ef401964531549fe3122efe128412e8d6ff2692
|
## Introduction
This dataset consists of approximately 1,000 instances of multi-turn roleplay conversations generated via self-instruct, each involving a different persona.
## Challenges:
The data is generated by the model itself, leading to potential integration of the model's own values into the roleplay scenarios. This may result in roleplays that are not entirely realistic or accurate.
## About Myself:
I am the developer of Xiaoyu (小雨), an AI specializing in emotion and personality. If you're interested in Xiaoyu, feel free to show your support! She is currently live on Bilibili, and I am continuously working on improvements.
In the future, Xiaoyu aims to become a multimodal general artificial intelligence with genuine human emotions.
URL: https://live.bilibili.com/27357528?broadcast_type=0&is_room_feed=1&spm_id_from=333.999.live_users_card.0.click&live_from=86001
## Note:
When using this dataset, please provide proper attribution to the source.
## Citation
```
@misc{selfinstruct,
title={Self-Instruct: Aligning Language Model with Self Generated Instructions},
author={Wang, Yizhong and Kordi, Yeganeh and Mishra, Swaroop and Liu, Alisa and Smith, Noah A. and Khashabi, Daniel and Hajishirzi, Hannaneh},
journal={arXiv preprint arXiv:2212.10560},
year={2022}
}
```
|
Minami-su/roleplay_multiturn_chat_1k_zh_v0.1
|
[
"language:zh",
"roleplay",
"multiturn_chat",
"doi:10.57967/hf/1381",
"region:us"
] |
2023-09-13T00:54:10+00:00
|
{"language": ["zh"], "tags": ["roleplay", "multiturn_chat"]}
|
2023-12-16T04:29:42+00:00
|
[] |
[
"zh"
] |
TAGS
#language-Chinese #roleplay #multiturn_chat #doi-10.57967/hf/1381 #region-us
|
## 介绍
基于self-instruct生成的多轮对话roleplay数据,约1k条不同的人格数据和对话
## 存在问题:
1.基于模型自身生成,所以roleplay存在模型本身价值观融入情况,导致roleplay不够真实,不够准确。
## 关于我自己:
我是小雨的开发者,小雨是一个情感ai,人格ai,如果对小雨感兴趣的话欢迎支持一下,她目前在bilibili直播,目前我仍在不断的改进。未来,“小雨”的目标是成为一个
具有真正人类情感的多模态通用人工智能。
url:URL
## 注:
使用本数据集请注明来源
## Introduction
This dataset consists of approximately 1,000 instances of multi-turn roleplay conversations generated based on self-instruction. Each instance involves different personalities engaging in dialogue.
## Challenges:
The data is generated by the model itself, leading to potential integration of the model's own values into the roleplay scenarios. This may result in roleplays that are not entirely realistic or accurate.
## About Myself:
I am the developer of Xiaoyu, an AI specializing in emotion and personality. If you're interested in Xiaoyu, feel free to show your support! She is currently live on Bilibili, and I am continuously working on improvements.
In the future, '小雨' aims to become a multimodal general artificial intelligence with genuine human emotions.
URL: URL
## Note:
When using this dataset, please provide proper attribution to the source.
## 引用
|
[
"## 介绍\n\n基于self-instruct生成的多轮对话roleplay数据,约1k条不同的人格数据和对话",
"## 存在问题:\n1.基于模型自身生成,所以roleplay存在模型本身价值观融入情况,导致roleplay不够真实,不够准确。",
"## 关于我自己:\n我是小雨的开发者,小雨是一个情感ai,人格ai,如果对小雨感兴趣的话欢迎支持一下,她目前在bilibili直播,目前我仍在不断的改进。未来,“小雨”的目标是成为一个\n具有真正人类情感的多模态通用人工智能。\n\nurl:URL",
"## 注:\n使用本数据集请注明来源",
"## Introduction\nThis dataset consists of approximately 1,000 instances of multi-turn roleplay conversations generated based on self-instruction. Each instance involves different personalities engaging in dialogue.",
"## Challenges:\nThe data is generated by the model itself, leading to potential integration of the model's own values into the roleplay scenarios. This may result in roleplays that are not entirely realistic or accurate.",
"## About Myself:\nI am the developer of Xiaoyu, an AI specializing in emotion and personality. If you're interested in Xiaoyu, feel free to show your support! She is currently live on Bilibili, and I am continuously working on improvements.\nIn the future, '小雨' aims to become a multimodal general artificial intelligence with genuine human emotions.\n\nURL: URL",
"## Note:\nWhen using this dataset, please provide proper attribution to the source.",
"## 引用"
] |
[
"TAGS\n#language-Chinese #roleplay #multiturn_chat #doi-10.57967/hf/1381 #region-us \n",
"## 介绍\n\n基于self-instruct生成的多轮对话roleplay数据,约1k条不同的人格数据和对话",
"## 存在问题:\n1.基于模型自身生成,所以roleplay存在模型本身价值观融入情况,导致roleplay不够真实,不够准确。",
"## 关于我自己:\n我是小雨的开发者,小雨是一个情感ai,人格ai,如果对小雨感兴趣的话欢迎支持一下,她目前在bilibili直播,目前我仍在不断的改进。未来,“小雨”的目标是成为一个\n具有真正人类情感的多模态通用人工智能。\n\nurl:URL",
"## 注:\n使用本数据集请注明来源",
"## Introduction\nThis dataset consists of approximately 1,000 instances of multi-turn roleplay conversations generated based on self-instruction. Each instance involves different personalities engaging in dialogue.",
"## Challenges:\nThe data is generated by the model itself, leading to potential integration of the model's own values into the roleplay scenarios. This may result in roleplays that are not entirely realistic or accurate.",
"## About Myself:\nI am the developer of Xiaoyu, an AI specializing in emotion and personality. If you're interested in Xiaoyu, feel free to show your support! She is currently live on Bilibili, and I am continuously working on improvements.\nIn the future, '小雨' aims to become a multimodal general artificial intelligence with genuine human emotions.\n\nURL: URL",
"## Note:\nWhen using this dataset, please provide proper attribution to the source.",
"## 引用"
] |
[
33,
28,
32,
71,
13,
42,
48,
86,
18,
3
] |
[
"passage: TAGS\n#language-Chinese #roleplay #multiturn_chat #doi-10.57967/hf/1381 #region-us \n## 介绍\n\n基于self-instruct生成的多轮对话roleplay数据,约1k条不同的人格数据和对话## 存在问题:\n1.基于模型自身生成,所以roleplay存在模型本身价值观融入情况,导致roleplay不够真实,不够准确。## 关于我自己:\n我是小雨的开发者,小雨是一个情感ai,人格ai,如果对小雨感兴趣的话欢迎支持一下,她目前在bilibili直播,目前我仍在不断的改进。未来,“小雨”的目标是成为一个\n具有真正人类情感的多模态通用人工智能。\n\nurl:URL## 注:\n使用本数据集请注明来源## Introduction\nThis dataset consists of approximately 1,000 instances of multi-turn roleplay conversations generated based on self-instruction. Each instance involves different personalities engaging in dialogue.## Challenges:\nThe data is generated by the model itself, leading to potential integration of the model's own values into the roleplay scenarios. This may result in roleplays that are not entirely realistic or accurate.## About Myself:\nI am the developer of Xiaoyu, an AI specializing in emotion and personality. If you're interested in Xiaoyu, feel free to show your support! She is currently live on Bilibili, and I am continuously working on improvements.\nIn the future, '小雨' aims to become a multimodal general artificial intelligence with genuine human emotions.\n\nURL: URL## Note:\nWhen using this dataset, please provide proper attribution to the source.## 引用"
] |
a47ce0a6379fa606bea2b06740997aa15ccc801c
|
# Dataset Card for "pubmed_rapnonbiomedical_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
zxvix/pubmed_rapnonbiomedical_2
|
[
"region:us"
] |
2023-09-13T00:54:17+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "MedlineCitation", "struct": [{"name": "PMID", "dtype": "int32"}, {"name": "DateCompleted", "struct": [{"name": "Year", "dtype": "int32"}, {"name": "Month", "dtype": "int32"}, {"name": "Day", "dtype": "int32"}]}, {"name": "NumberOfReferences", "dtype": "int32"}, {"name": "DateRevised", "struct": [{"name": "Year", "dtype": "int32"}, {"name": "Month", "dtype": "int32"}, {"name": "Day", "dtype": "int32"}]}, {"name": "Article", "struct": [{"name": "Abstract", "struct": [{"name": "AbstractText", "dtype": "string"}]}, {"name": "ArticleTitle", "dtype": "string"}, {"name": "AuthorList", "struct": [{"name": "Author", "sequence": [{"name": "LastName", "dtype": "string"}, {"name": "ForeName", "dtype": "string"}, {"name": "Initials", "dtype": "string"}, {"name": "CollectiveName", "dtype": "string"}]}]}, {"name": "Language", "dtype": "string"}, {"name": "GrantList", "struct": [{"name": "Grant", "sequence": [{"name": "GrantID", "dtype": "string"}, {"name": "Agency", "dtype": "string"}, {"name": "Country", "dtype": "string"}]}]}, {"name": "PublicationTypeList", "struct": [{"name": "PublicationType", "sequence": "string"}]}]}, {"name": "MedlineJournalInfo", "struct": [{"name": "Country", "dtype": "string"}]}, {"name": "ChemicalList", "struct": [{"name": "Chemical", "sequence": [{"name": "RegistryNumber", "dtype": "string"}, {"name": "NameOfSubstance", "dtype": "string"}]}]}, {"name": "CitationSubset", "dtype": "string"}, {"name": "MeshHeadingList", "struct": [{"name": "MeshHeading", "sequence": [{"name": "DescriptorName", "dtype": "string"}, {"name": "QualifierName", "dtype": "string"}]}]}]}, {"name": "PubmedData", "struct": [{"name": "ArticleIdList", "sequence": [{"name": "ArticleId", "sequence": "string"}]}, {"name": "PublicationStatus", "dtype": "string"}, {"name": "History", "struct": [{"name": "PubMedPubDate", "sequence": [{"name": "Year", "dtype": "int32"}, {"name": "Month", "dtype": "int32"}, {"name": "Day", "dtype": "int32"}]}]}, {"name": "ReferenceList", "sequence": [{"name": "Citation", "dtype": "string"}, {"name": "CitationId", "dtype": "int32"}]}]}, {"name": "text", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "original_text", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 4112162.316, "num_examples": 982}], "download_size": 2354929, "dataset_size": 4112162.316}}
|
2023-09-13T01:23:56+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "pubmed_rapnonbiomedical_2"
More Information needed
|
[
"# Dataset Card for \"pubmed_rapnonbiomedical_2\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"pubmed_rapnonbiomedical_2\"\n\nMore Information needed"
] |
[
6,
20
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"pubmed_rapnonbiomedical_2\"\n\nMore Information needed"
] |
6cc4c82203fa870050e3a436ef71a74c4cbfa6c0
|
# Dataset Card for Evaluation run of teknium/OpenHermes-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/teknium/OpenHermes-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [teknium/OpenHermes-13B](https://huggingface.co/teknium/OpenHermes-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_teknium__OpenHermes-13B",
"harness_winogrande_5",
split="train")
```
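Individual timestamped runs can be addressed the same way. Below is a minimal sketch using the `harness_drop_3` configuration and a run timestamp taken from this repository's declared splits (split names encode the run timestamp with `-` and `:` mapped to `_`):
```python
from datasets import load_dataset

# Load one specific run instead of the latest results; the config and
# split names below come from this repository's declared configurations.
run_details = load_dataset(
    "open-llm-leaderboard/details_teknium__OpenHermes-13B",
    "harness_drop_3",
    split="2023_10_24T20_23_56.851767",
)
print(run_details)
```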
## Latest results
These are the [latest results from run 2023-10-24T20:23:56.851767](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-13B/blob/main/results_2023-10-24T20-23-56.851767.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.003984899328859061,
"em_stderr": 0.0006451805848102473,
"f1": 0.06597944630872499,
"f1_stderr": 0.0014689416324005639,
"acc": 0.4352676233998515,
"acc_stderr": 0.010457879214313065
},
"harness|drop|3": {
"em": 0.003984899328859061,
"em_stderr": 0.0006451805848102473,
"f1": 0.06597944630872499,
"f1_stderr": 0.0014689416324005639
},
"harness|gsm8k|5": {
"acc": 0.11599696739954511,
"acc_stderr": 0.008820485491442487
},
"harness|winogrande|5": {
"acc": 0.7545382794001578,
"acc_stderr": 0.012095272937183644
}
}
```
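The aggregated numbers shown above live in the "results" configuration mentioned earlier. A minimal sketch, assuming that configuration follows the same split convention as the task configurations (with "train" pointing at the latest run):
```python
from datasets import load_dataset

# Assumption: the "results" configuration exposes the aggregated metrics
# and its "train" split points to the latest run, as described above.
results = load_dataset(
    "open-llm-leaderboard/details_teknium__OpenHermes-13B",
    "results",
    split="train",
)
print(results[0])
```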
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_teknium__OpenHermes-13B
|
[
"region:us"
] |
2023-09-13T00:57:13+00:00
|
{"pretty_name": "Evaluation run of teknium/OpenHermes-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [teknium/OpenHermes-13B](https://huggingface.co/teknium/OpenHermes-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_teknium__OpenHermes-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-24T20:23:56.851767](https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-13B/blob/main/results_2023-10-24T20-23-56.851767.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003984899328859061,\n \"em_stderr\": 0.0006451805848102473,\n \"f1\": 0.06597944630872499,\n \"f1_stderr\": 0.0014689416324005639,\n \"acc\": 0.4352676233998515,\n \"acc_stderr\": 0.010457879214313065\n },\n \"harness|drop|3\": {\n \"em\": 0.003984899328859061,\n \"em_stderr\": 0.0006451805848102473,\n \"f1\": 0.06597944630872499,\n \"f1_stderr\": 0.0014689416324005639\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11599696739954511,\n \"acc_stderr\": 0.008820485491442487\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7545382794001578,\n \"acc_stderr\": 0.012095272937183644\n }\n}\n```", "repo_url": "https://huggingface.co/teknium/OpenHermes-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|arc:challenge|25_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|arc:challenge|25_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_24T20_23_56.851767", "path": ["**/details_harness|drop|3_2023-10-24T20-23-56.851767.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-24T20-23-56.851767.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_24T20_23_56.851767", "path": ["**/details_harness|gsm8k|5_2023-10-24T20-23-56.851767.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-24T20-23-56.851767.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hellaswag|10_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": 
["**/details_harness|hellaswag|10_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T01-56-57.835904.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T01-56-57.835904.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-06-09.559271.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-06-09.559271.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-06-09.559271.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T02-06-09.559271.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": 
["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": 
"2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": 
"2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": 
"2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T02-06-09.559271.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T02-06-09.559271.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_24T20_23_56.851767", "path": ["**/details_harness|winogrande|5_2023-10-24T20-23-56.851767.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-24T20-23-56.851767.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_13T01_56_57.835904", "path": ["results_2023-09-13T01-56-57.835904.parquet"]}, {"split": "2023_09_13T02_06_09.559271", "path": ["results_2023-09-13T02-06-09.559271.parquet"]}, {"split": "2023_10_24T20_23_56.851767", "path": ["results_2023-10-24T20-23-56.851767.parquet"]}, {"split": "latest", "path": ["results_2023-10-24T20-23-56.851767.parquet"]}]}]}
|
2023-10-24T19:24:09+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of teknium/OpenHermes-13B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model teknium/OpenHermes-13B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
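A minimal sketch of such a load is below; the `open-llm-leaderboard/details_teknium__OpenHermes-13B` repo id is an assumption following the `details_<org>__<model>` naming used by the other leaderboard cards in this dump, and `harness_winogrande_5` is one of the configs listed in this dataset's metadata:
```python
from datasets import load_dataset

# Repo id assumed from the details_<org>__<model> naming convention;
# "harness_winogrande_5" is one of the configs listed for this dataset.
data = load_dataset(
    "open-llm-leaderboard/details_teknium__OpenHermes-13B",
    "harness_winogrande_5",
    split="train",
)
```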
## Latest results
These are the latest results from run 2023-10-24T20:23:56.851767 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of teknium/OpenHermes-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model teknium/OpenHermes-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T20:23:56.851767(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of teknium/OpenHermes-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model teknium/OpenHermes-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T20:23:56.851767(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
165,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of teknium/OpenHermes-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model teknium/OpenHermes-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-24T20:23:56.851767(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
099a7e7eb765f8fd07a088583866357c40129e31
|
This is an R-18 (including R-18G) Simplified Chinese novel dataset from the Pixiv website.
It contains 145,163 novels in total; the data is current as of 7 p.m. Beijing time, September 12, 2023.
Files are stored as Pixiv/userID/ID.txt, holding the plain-text body, and Pixiv/userID/ID-meta.txt, holding extra information (including tag, title, Description, etc.).
The data has not been cleaned and may contain low-quality content.
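A minimal sketch of walking the archive under the layout described above; only the `Pixiv/userID/ID.txt` / `Pixiv/userID/ID-meta.txt` pairing is taken from the description, everything else is an assumption:
```python
import os

root = "Pixiv"  # top-level directory from the layout described above
for user_id in os.listdir(root):
    user_dir = os.path.join(root, user_id)
    if not os.path.isdir(user_dir):
        continue
    for name in os.listdir(user_dir):
        # Skip meta files here; each is paired with its body file below.
        if not name.endswith(".txt") or name.endswith("-meta.txt"):
            continue
        body_path = os.path.join(user_dir, name)
        meta_path = body_path[:-len(".txt")] + "-meta.txt"  # ID.txt -> ID-meta.txt
        with open(body_path, encoding="utf-8") as f:
            body = f.read()  # the novel body text
        meta = None
        if os.path.exists(meta_path):
            with open(meta_path, encoding="utf-8") as f:
                meta = f.read()  # extra info: tag, title, Description, etc.
```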
|
wuliangfo/Chinese-Pixiv-Novel
|
[
"license:openrail",
"region:us"
] |
2023-09-13T01:03:57+00:00
|
{"license": "openrail"}
|
2023-09-18T10:27:13+00:00
|
[] |
[] |
TAGS
#license-openrail #region-us
|
This is an R-18 (including R-18G) Simplified Chinese novel dataset from the Pixiv website.
It contains 145,163 novels in total; the data is current as of 7 p.m. Beijing time, September 12, 2023.
Files are stored as Pixiv/userID/URL, holding the plain-text body, and Pixiv/userID/URL, holding extra information (including tag, title, Description, etc.).
The data has not been cleaned and may contain low-quality content.
|
[] |
[
"TAGS\n#license-openrail #region-us \n"
] |
[
12
] |
[
"passage: TAGS\n#license-openrail #region-us \n"
] |
6f4faa17152807095374e9bbe7f8d078f202e5cf
|
# Dataset of ninomiya_asuka/二宮飛鳥/니노미야아스카 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of ninomiya_asuka/二宮飛鳥/니노미야아스카 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `multicolored_hair, two-tone_hair, long_hair, purple_eyes, orange_hair, bangs, hair_between_eyes, breasts, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 648.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninomiya_asuka_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 389.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninomiya_asuka_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1235 | 837.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninomiya_asuka_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 583.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninomiya_asuka_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1235 | 1.14 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ninomiya_asuka_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ninomiya_asuka_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, solo, collarbone, looking_at_viewer, blush, navel, choker, bracelet, red_hair, small_breasts, cleavage, twin_braids, black_bikini, medium_breasts, white_background, simple_background, smile, cowboy_shot, open_mouth, pink_hair |
| 1 | 5 |  |  |  |  |  | 1girl, choker, looking_at_viewer, solo, collarbone, purple_hair, simple_background, upper_body, white_background, long_sleeves, open_mouth, shiny_hair, :d, ahoge, black_shirt, blush, necklace, plaid, sketch |
| 2 | 6 |  |  |  |  |  | 1girl, blue_hair, long_sleeves, shiny_hair, solo, very_long_hair, white_shirt, blue_skirt, frills, looking_at_viewer, miniskirt, underbust, dress_shirt, layered_skirt, black_thighhighs, blush, hair_flower, on_back, simple_background, white_ascot, white_background, zettai_ryouiki |
| 3 | 15 |  |  |  |  |  | 1girl, solo, looking_at_viewer, skirt, thighhighs, smile, beret, braid, detached_sleeves, feathers, necktie, pink_hair |
| 4 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, fingerless_gloves, hood_up, red_hair, midriff, choker, elbow_gloves, navel, red_cape, smile, braid, hooded_cloak, red_cloak, closed_mouth, nail_polish, red_skirt, belt, black_gloves, chain, holding, miniskirt, small_breasts, standing, sword |
| 5 | 13 |  |  |  |  |  | enmaided, wrist_cuffs, 1girl, blush, cat_ears, solo, black_dress, looking_at_viewer, neck_ribbon, white_apron, black_ribbon, blonde_hair, puffy_short_sleeves, simple_background, waist_apron, frilled_apron, fake_animal_ears, white_background, white_thighhighs, closed_mouth, detached_collar, small_breasts, maid_apron, smile, zettai_ryouiki |
| 6 | 15 |  |  |  |  |  | 1girl, blush, nipples, small_breasts, collarbone, open_mouth, 1boy, completely_nude, hetero, simple_background, white_background, looking_at_viewer, navel, solo_focus, sweat, mosaic_censoring |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | collarbone | looking_at_viewer | blush | navel | choker | bracelet | red_hair | small_breasts | cleavage | twin_braids | black_bikini | medium_breasts | white_background | simple_background | smile | cowboy_shot | open_mouth | pink_hair | purple_hair | upper_body | long_sleeves | shiny_hair | :d | ahoge | black_shirt | necklace | plaid | sketch | blue_hair | very_long_hair | white_shirt | blue_skirt | frills | miniskirt | underbust | dress_shirt | layered_skirt | black_thighhighs | hair_flower | on_back | white_ascot | zettai_ryouiki | skirt | thighhighs | beret | braid | detached_sleeves | feathers | necktie | fingerless_gloves | hood_up | midriff | elbow_gloves | red_cape | hooded_cloak | red_cloak | closed_mouth | nail_polish | red_skirt | belt | black_gloves | chain | holding | standing | sword | enmaided | wrist_cuffs | cat_ears | black_dress | neck_ribbon | white_apron | black_ribbon | blonde_hair | puffy_short_sleeves | waist_apron | frilled_apron | fake_animal_ears | white_thighhighs | detached_collar | maid_apron | nipples | 1boy | completely_nude | hetero | solo_focus | sweat | mosaic_censoring |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------|:--------------------|:--------|:--------|:---------|:-----------|:-----------|:----------------|:-----------|:--------------|:---------------|:-----------------|:-------------------|:--------------------|:--------|:--------------|:-------------|:------------|:--------------|:-------------|:---------------|:-------------|:-----|:--------|:--------------|:-----------|:--------|:---------|:------------|:-----------------|:--------------|:-------------|:---------|:------------|:------------|:--------------|:----------------|:-------------------|:--------------|:----------|:--------------|:-----------------|:--------|:-------------|:--------|:--------|:-------------------|:-----------|:----------|:--------------------|:----------|:----------|:---------------|:-----------|:---------------|:------------|:---------------|:--------------|:------------|:-------|:---------------|:--------|:----------|:-----------|:--------|:-----------|:--------------|:-----------|:--------------|:--------------|:--------------|:---------------|:--------------|:----------------------|:--------------|:----------------|:-------------------|:-------------------|:------------------|:-------------|:----------|:-------|:------------------|:---------|:-------------|:--------|:-------------------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | | | | | | | X | X | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | | X | X | | | | | | | | | | X | X | | | | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 15 |  |  |  |  |  | X | X | | X | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | X | | X | | X | X | | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 13 |  |  |  |  |  | X | X | | X | X | | | | | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 6 | 15 |  |  |  |  |  | X | | X | X | X | X | | | | X | | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
CyberHarem/ninomiya_asuka_idolmastercinderellagirls
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-13T01:09:01+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T14:29:22+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of ninomiya\_asuka/二宮飛鳥/니노미야아스카 (THE iDOLM@STER: Cinderella Girls)
==========================================================================
This is the dataset of ninomiya\_asuka/二宮飛鳥/니노미야아스카 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are 'multicolored\_hair, two-tone\_hair, long\_hair, purple\_eyes, orange\_hair, bangs, hair\_between\_eyes, breasts, brown\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
700c98abe9d3c9a443284b55600c4a6131005d42
|
# Dataset Card for Evaluation run of wei123602/llama-13b-FINETUNE3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wei123602/llama-13b-FINETUNE3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [wei123602/llama-13b-FINETUNE3](https://huggingface.co/wei123602/llama-13b-FINETUNE3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3",
"harness_winogrande_5",
split="train")
```
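The configs listed in this dataset's metadata name their splits by run timestamp plus a "latest" alias, so you can also pin the newest results explicitly instead of relying on the "train" split:
```python
from datasets import load_dataset

# "latest" is an explicit split alias defined by each config above.
latest = load_dataset(
    "open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3",
    "harness_winogrande_5",
    split="latest",
)
```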
## Latest results
These are the [latest results from run 2023-10-25T20:56:48.132337](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3/blob/main/results_2023-10-25T20-56-48.132337.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.10360738255033557,
"em_stderr": 0.003120930790921416,
"f1": 0.14798552852348912,
"f1_stderr": 0.003214007613815376,
"acc": 0.4442352766589695,
"acc_stderr": 0.010435544785566055
},
"harness|drop|3": {
"em": 0.10360738255033557,
"em_stderr": 0.003120930790921416,
"f1": 0.14798552852348912,
"f1_stderr": 0.003214007613815376
},
"harness|gsm8k|5": {
"acc": 0.12130401819560273,
"acc_stderr": 0.00899288849727557
},
"harness|winogrande|5": {
"acc": 0.7671665351223362,
"acc_stderr": 0.01187820107385654
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3
|
[
"region:us"
] |
2023-09-13T01:24:54+00:00
|
{"pretty_name": "Evaluation run of wei123602/llama-13b-FINETUNE3", "dataset_summary": "Dataset automatically created during the evaluation run of model [wei123602/llama-13b-FINETUNE3](https://huggingface.co/wei123602/llama-13b-FINETUNE3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-25T20:56:48.132337](https://huggingface.co/datasets/open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3/blob/main/results_2023-10-25T20-56-48.132337.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.10360738255033557,\n \"em_stderr\": 0.003120930790921416,\n \"f1\": 0.14798552852348912,\n \"f1_stderr\": 0.003214007613815376,\n \"acc\": 0.4442352766589695,\n \"acc_stderr\": 0.010435544785566055\n },\n \"harness|drop|3\": {\n \"em\": 0.10360738255033557,\n \"em_stderr\": 0.003120930790921416,\n \"f1\": 0.14798552852348912,\n \"f1_stderr\": 0.003214007613815376\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12130401819560273,\n \"acc_stderr\": 0.00899288849727557\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.01187820107385654\n }\n}\n```", "repo_url": "https://huggingface.co/wei123602/llama-13b-FINETUNE3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|arc:challenge|25_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_25T20_56_48.132337", "path": ["**/details_harness|drop|3_2023-10-25T20-56-48.132337.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T20-56-48.132337.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_25T20_56_48.132337", "path": ["**/details_harness|gsm8k|5_2023-10-25T20-56-48.132337.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-25T20-56-48.132337.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hellaswag|10_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-24-38.254919.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-24-38.254919.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T02-24-38.254919.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T02-24-38.254919.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T02-24-38.254919.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_25T20_56_48.132337", "path": ["**/details_harness|winogrande|5_2023-10-25T20-56-48.132337.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T20-56-48.132337.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_13T02_24_38.254919", "path": ["results_2023-09-13T02-24-38.254919.parquet"]}, {"split": "2023_10_25T20_56_48.132337", "path": ["results_2023-10-25T20-56-48.132337.parquet"]}, {"split": "latest", "path": ["results_2023-10-25T20-56-48.132337.parquet"]}]}]}
|
2023-10-25T19:57:00+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of wei123602/llama-13b-FINETUNE3
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model wei123602/llama-13b-FINETUNE3 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
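A minimal loading sketch, assuming the leaderboard's usual `details_<org>__<model>` repository naming for this model and a config name taken from the metadata above:

```python
from datasets import load_dataset

# The config name ("harness_winogrande_5") and the "latest" split both
# appear in the metadata above; the repo id follows the leaderboard's
# naming convention and is an assumption here.
data = load_dataset(
    "open-llm-leaderboard/details_wei123602__llama-13b-FINETUNE3",
    "harness_winogrande_5",
    split="latest",
)
```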
## Latest results
These are the latest results from run 2023-10-25T20:56:48.132337 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of wei123602/llama-13b-FINETUNE3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model wei123602/llama-13b-FINETUNE3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-25T20:56:48.132337(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of wei123602/llama-13b-FINETUNE3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model wei123602/llama-13b-FINETUNE3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-25T20:56:48.132337(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of wei123602/llama-13b-FINETUNE3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model wei123602/llama-13b-FINETUNE3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-25T20:56:48.132337(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
3bcf7ee41c3fc9d29c41df0395bd06813c10f5c0
|
# Dataset Card for "sven"
Unofficial, not affiliated with the authors.
Paper: https://arxiv.org/abs/2302.05319
Repository: https://github.com/eth-sri/sven
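A minimal loading sketch; the split names ("train", "val") and field names follow the schema in the repo metadata further down and may have changed upstream:

```python
from datasets import load_dataset

# Fields come from the dataset's declared schema: func_src_before /
# func_src_after are the function before and after the security fix.
ds = load_dataset("benjis/sven", split="train")

ex = ds[0]
print(ex["vul_type"])          # vulnerability category of the fix
print(ex["func_src_before"])   # vulnerable version of the function
print(ex["func_src_after"])    # patched version
```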
|
benjis/sven
|
[
"arxiv:2302.05319",
"region:us"
] |
2023-09-13T01:27:09+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "val", "path": "data/val-*"}]}], "dataset_info": {"features": [{"name": "func_name", "dtype": "string"}, {"name": "func_src_before", "dtype": "string"}, {"name": "func_src_after", "dtype": "string"}, {"name": "line_changes", "struct": [{"name": "deleted", "list": [{"name": "line_no", "dtype": "int64"}, {"name": "char_start", "dtype": "int64"}, {"name": "char_end", "dtype": "int64"}, {"name": "line", "dtype": "string"}]}, {"name": "added", "list": [{"name": "line_no", "dtype": "int64"}, {"name": "char_start", "dtype": "int64"}, {"name": "char_end", "dtype": "int64"}, {"name": "line", "dtype": "string"}]}]}, {"name": "char_changes", "struct": [{"name": "deleted", "list": [{"name": "char_start", "dtype": "int64"}, {"name": "char_end", "dtype": "int64"}, {"name": "chars", "dtype": "string"}]}, {"name": "added", "list": [{"name": "char_start", "dtype": "int64"}, {"name": "char_end", "dtype": "int64"}, {"name": "chars", "dtype": "string"}]}]}, {"name": "commit_link", "dtype": "string"}, {"name": "file_name", "dtype": "string"}, {"name": "vul_type", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4961153, "num_examples": 720}, {"name": "val", "num_bytes": 621398, "num_examples": 83}], "download_size": 2246744, "dataset_size": 5582551}}
|
2023-09-18T17:25:16+00:00
|
[
"2302.05319"
] |
[] |
TAGS
#arxiv-2302.05319 #region-us
|
# Dataset Card for "sven"
Unofficial, not affiliated with the authors.
Paper: URL
Repository: URL
|
[
"# Dataset Card for \"sven\"\n\nUnofficial, not affiliated with the authors.\n\n Paper: URL\n Repository: URL"
] |
[
"TAGS\n#arxiv-2302.05319 #region-us \n",
"# Dataset Card for \"sven\"\n\nUnofficial, not affiliated with the authors.\n\n Paper: URL\n Repository: URL"
] |
[
15,
28
] |
[
"passage: TAGS\n#arxiv-2302.05319 #region-us \n# Dataset Card for \"sven\"\n\nUnofficial, not affiliated with the authors.\n\n Paper: URL\n Repository: URL"
] |
2c293913177ae0a381950e2f0cf4c1880067e8a9
|
## Introduction
Based on self-instruct and evol-instruct, supplemented by data generated through online learning, the instructions range from simple to complex. The analysis in the input is the result of the online-learning analysis.
## Challenges:
1. Instructions may not be entirely accurate, but they can be iterated upon continuously.
## Citation
```
@misc{selfinstruct,
title={Self-Instruct: Aligning Language Model with Self Generated Instructions},
author={Wang, Yizhong and Kordi, Yeganeh and Mishra, Swaroop and Liu, Alisa and Smith, Noah A. and Khashabi, Daniel and Hajishirzi, Hannaneh},
journal={arXiv preprint arXiv:2212.10560},
year={2022}
}
```
```
@article{xu2023wizardlm,
title={Wizardlm: Empowering large language models to follow complex instructions},
author={Xu, Can and Sun, Qingfeng and Zheng, Kai and Geng, Xiubo and Zhao, Pu and Feng, Jiazhan and Tao, Chongyang and Jiang, Daxin},
journal={arXiv preprint arXiv:2304.12244},
year={2023}
}
```
|
Minami-su/Complex_Evol_Network_Instruct_v0.1
|
[
"language:zh",
"evol",
"online",
"complex",
"doi:10.57967/hf/1397",
"region:us"
] |
2023-09-13T01:34:43+00:00
|
{"language": ["zh"], "tags": ["evol", "online", "complex"]}
|
2023-09-13T01:45:27+00:00
|
[] |
[
"zh"
] |
TAGS
#language-Chinese #evol #online #complex #doi-10.57967/hf/1397 #region-us
|
## Introduction
Based on self-instruct and evol-instruct, supplemented by data generated through online learning, the instructions range from simple to complex. The analysis in the input is the result of the online-learning analysis.
## Challenges:
1. Instructions may not be entirely accurate, but they can be iterated upon continuously.
## Citation
|
[
"## 介绍\n\n基于self-instruct,evol—instruct,辅以联网学习生成的数据,指令由简单到复杂,input里的分析为联网学习的分析结果",
"## 存在问题:\n1.指令不一定完全正确,但可以不断迭代",
"## Introduction\n\nBased on self-instruct and evol-instruct, supplemented by data generated through online learning, the instructions range from simple to complex. The analysis in the input is the result of online learning analysis.",
"## Challenges:\n1. Instructions may not be entirely accurate, but can be iterated upon continuously.",
"## 引用"
] |
[
"TAGS\n#language-Chinese #evol #online #complex #doi-10.57967/hf/1397 #region-us \n",
"## 介绍\n\n基于self-instruct,evol—instruct,辅以联网学习生成的数据,指令由简单到复杂,input里的分析为联网学习的分析结果",
"## 存在问题:\n1.指令不一定完全正确,但可以不断迭代",
"## Introduction\n\nBased on self-instruct and evol-instruct, supplemented by data generated through online learning, the instructions range from simple to complex. The analysis in the input is the result of online learning analysis.",
"## Challenges:\n1. Instructions may not be entirely accurate, but can be iterated upon continuously.",
"## 引用"
] |
[
31,
41,
16,
48,
24,
3
] |
[
"passage: TAGS\n#language-Chinese #evol #online #complex #doi-10.57967/hf/1397 #region-us \n## 介绍\n\n基于self-instruct,evol—instruct,辅以联网学习生成的数据,指令由简单到复杂,input里的分析为联网学习的分析结果## 存在问题:\n1.指令不一定完全正确,但可以不断迭代## Introduction\n\nBased on self-instruct and evol-instruct, supplemented by data generated through online learning, the instructions range from simple to complex. The analysis in the input is the result of online learning analysis.## Challenges:\n1. Instructions may not be entirely accurate, but can be iterated upon continuously.## 引用"
] |
c99a757d1e35bf407a06159078f68c1508010c89
|
# Dataset Card for "ICD_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
adalbertojunior/ICD_dataset
|
[
"region:us"
] |
2023-09-13T01:49:47+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "label", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 418410601, "num_examples": 39354}, {"name": "test", "num_bytes": 53529100, "num_examples": 5000}, {"name": "validation", "num_bytes": 52947510, "num_examples": 5000}], "download_size": 301971173, "dataset_size": 524887211}}
|
2023-09-13T20:59:45+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "ICD_dataset"
More Information needed
|
[
"# Dataset Card for \"ICD_dataset\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"ICD_dataset\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"ICD_dataset\"\n\nMore Information needed"
] |
345707f95b13745724c23e5e76e849c56fc482f2
|
# Dataset of plumeri/プルメリ (Pokémon)
This is the dataset of plumeri/プルメリ (Pokémon), containing 215 images and their tags.
The core tags of this character are `pink_hair, multicolored_hair, blonde_hair, two-tone_hair, long_hair, yellow_eyes, hair_ornament, quad_tails, breasts, eyeshadow, skull_hair_ornament, dark_skin`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 215 | 201.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/plumeri_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 215 | 128.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/plumeri_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 452 | 240.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/plumeri_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 215 | 183.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/plumeri_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 452 | 313.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/plumeri_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/plumeri_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 30 |  |  |  |  |  | 1girl, crop_top, makeup, tank_top, navel, midriff, stomach_tattoo, black_pants, solo, wristband, looking_at_viewer, closed_mouth, skull_necklace, pubic_tattoo, bare_shoulders, frown |
| 1 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, open_mouth, cum_in_pussy, makeup, pokephilia, solo_focus, sweat, tank_top, tongue_out, ahegao, doggystyle, necklace, penis, pokemon_(creature), saliva, sex_from_behind, uncensored, vaginal, anus, ass_grab, bestiality, fucked_silly, medium_breasts, rolling_eyes, spread_legs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | crop_top | makeup | tank_top | navel | midriff | stomach_tattoo | black_pants | solo | wristband | looking_at_viewer | closed_mouth | skull_necklace | pubic_tattoo | bare_shoulders | frown | 1boy | blush | hetero | open_mouth | cum_in_pussy | pokephilia | solo_focus | sweat | tongue_out | ahegao | doggystyle | necklace | penis | pokemon_(creature) | saliva | sex_from_behind | uncensored | vaginal | anus | ass_grab | bestiality | fucked_silly | medium_breasts | rolling_eyes | spread_legs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:---------|:-----------|:--------|:----------|:-----------------|:--------------|:-------|:------------|:--------------------|:---------------|:-----------------|:---------------|:-----------------|:--------|:-------|:--------|:---------|:-------------|:---------------|:-------------|:-------------|:--------|:-------------|:---------|:-------------|:-----------|:--------|:---------------------|:---------|:------------------|:-------------|:----------|:-------|:-----------|:-------------|:---------------|:-----------------|:---------------|:--------------|
| 0 | 30 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/plumeri_pokemon
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-13T01:51:16+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T23:08:25+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of plumeri/プルメリ (Pokémon)
=================================
This is the dataset of plumeri/プルメリ (Pokémon), containing 215 images and their tags.
The core tags of this character are 'pink\_hair, multicolored\_hair, blonde\_hair, two-tone\_hair, long\_hair, yellow\_eyes, hair\_ornament, quad\_tails, breasts, eyeshadow, skull\_hair\_ornament, dark\_skin', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
8fba7ee941523e7da69f4ff882b52e5566d24288
|
## Dataset Contents:
Information on 700+ Alibaba Cloud OpenAPIs, covering the public Open API details of multiple products, including DataWorks, EMR, DataLake, MaxCompute, Hologram, Realtime Compute for Apache Flink, Quick BI, and DTS.
## Sample
```
{
    "systemPrompt": "你是一个函数筛选助理,如果与问题相关的话,您可以使用下面的函数来获取更多数据以回答用户提出的问题:{\"function\": \"UpdateTicketNum\", \"description\": \"对用于免登嵌入报表的指定的ticket进行更新票据数量操作。\", \"arguments\": [{\"name\": \"Ticket\", \"type\": \"string\", \"description\": \"三方嵌入的票据值,即URL中的accessTicket值。\"}, {\"name\": \"TicketNum\", \"type\": \"integer\", \"description\": \"票据数。\n- 取值范围:1~99998,建议值为1。\"}]}{\"function\": \"DeregisterLocation\", \"description\": \"取消Location注册。\", \"arguments\": [{\"name\": \"LocationId\", \"type\": \"string\", \"description\": \"Location ID\n> 您可以调用接口RegisterLocation获取Location ID。\"}]}{\"function\": \"SyncMemberBehaviorInfo\", \"description\": \"保存会员行为信息。\", \"arguments\": [{\"name\": \"body\", \"type\": \"object\", \"description\": \"请求参数\"}]}请以如下格式回复:{\"function\":\"function_name\",\"arguments\": {\"argument1\": value1,\"argument2\": value2}}",
    "userPrompt": "我想将免登嵌入报表的票据值为\"abcd1234\"的票据数量更新为10。",
    "assistantResponse": {
        "function": "UpdateTicketNum",
        "arguments": [
            {
                "Ticket": "abcd1234",
                "TicketNum": 10
            }
        ]
    }
}
```
### Fields
```
systemPrompt: the instruction
userPrompt: the user input
assistantResponse: the output
```
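A minimal sketch of reading one record, assuming the field names above and the "train" split from the repo metadata; whether assistantResponse is stored as a JSON string or a structured object is an assumption to verify:

```python
import json
from datasets import load_dataset

# Field names follow the "Fields" section above; the split name comes
# from the repo's config metadata.
ds = load_dataset("Deepexi/function-calling-small", split="train")

sample = ds[0]
print(sample["systemPrompt"])   # instruction listing the candidate functions
print(sample["userPrompt"])     # the user's request

# The target output is a function call; parse it if stored as a string.
resp = sample["assistantResponse"]
call = json.loads(resp) if isinstance(resp, str) else resp
print(call["function"], call["arguments"])
```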
## Dataset Uses
- Function-call understanding: by analyzing the function-call information in the conversations, language models can better learn the relationships between functions, improving their code-understanding ability.
- Alibaba Cloud OpenAPI: using the Alibaba Cloud OpenAPI information in the data, a model can better understand the relevant APIs and how to invoke them, and provide more suitable function suggestions during development.
If you have any questions or need further assistance, please feel free to contact us. Thank you for your interest in and support of this function-calling dataset and its applications!
|
Deepexi/function-calling-small
|
[
"task_categories:feature-extraction",
"size_categories:10K<n<100K",
"language:zh",
"license:cc-by-4.0",
"code",
"region:us"
] |
2023-09-13T02:07:25+00:00
|
{"language": ["zh"], "license": "cc-by-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["feature-extraction"], "tags": ["code"]}
|
2023-09-13T11:03:16+00:00
|
[] |
[
"zh"
] |
TAGS
#task_categories-feature-extraction #size_categories-10K<n<100K #language-Chinese #license-cc-by-4.0 #code #region-us
|
## Dataset Contents:
Information on 700+ Alibaba Cloud OpenAPIs, covering the public Open API details of multiple products, including DataWorks, EMR, DataLake, MaxCompute, Hologram, Realtime Compute for Apache Flink, Quick BI, and DTS.
## Sample
### Fields
## Dataset Uses
- Function-call understanding: by analyzing the function-call information in the conversations, language models can better learn the relationships between functions, improving their code-understanding ability.
- Alibaba Cloud OpenAPI: using the Alibaba Cloud OpenAPI information in the data, a model can better understand the relevant APIs and how to invoke them, and provide more suitable function suggestions during development.
If you have any questions or need further assistance, please feel free to contact us. Thank you for your interest in and support of this function-calling dataset and its applications!
|
[
"## 数据集内容说明:\n包含700+个阿里云OpenAPI的信息;包括Dataworks,EMR,DataLake,Maxcompute,Hologram,实时计算Flink版,QuickBI,DTS等多个产品的公开Open API信息。",
"## 样例",
"### 字段",
"## 数据集用途\n\n- 函数调用理解: 通过分析对话中的函数调用信息,让语言模型更好地理解函数之间的关系,从而提高其代码理解能力。\n- 阿里云OpenAPI:基于数据中阿里云OpenAPI的信息,模型可以更好的理解其相关信息以及调用方式,在开发过程中提供更合适的函数建议。\n\n如有任何问题或需要进一步帮助,请随时联系我们。感谢您对函数调用数据集及其应用的兴趣与支持!"
] |
[
"TAGS\n#task_categories-feature-extraction #size_categories-10K<n<100K #language-Chinese #license-cc-by-4.0 #code #region-us \n",
"## 数据集内容说明:\n包含700+个阿里云OpenAPI的信息;包括Dataworks,EMR,DataLake,Maxcompute,Hologram,实时计算Flink版,QuickBI,DTS等多个产品的公开Open API信息。",
"## 样例",
"### 字段",
"## 数据集用途\n\n- 函数调用理解: 通过分析对话中的函数调用信息,让语言模型更好地理解函数之间的关系,从而提高其代码理解能力。\n- 阿里云OpenAPI:基于数据中阿里云OpenAPI的信息,模型可以更好的理解其相关信息以及调用方式,在开发过程中提供更合适的函数建议。\n\n如有任何问题或需要进一步帮助,请随时联系我们。感谢您对函数调用数据集及其应用的兴趣与支持!"
] |
[
46,
58,
4,
5,
104
] |
[
"passage: TAGS\n#task_categories-feature-extraction #size_categories-10K<n<100K #language-Chinese #license-cc-by-4.0 #code #region-us \n## 数据集内容说明:\n包含700+个阿里云OpenAPI的信息;包括Dataworks,EMR,DataLake,Maxcompute,Hologram,实时计算Flink版,QuickBI,DTS等多个产品的公开Open API信息。## 样例### 字段## 数据集用途\n\n- 函数调用理解: 通过分析对话中的函数调用信息,让语言模型更好地理解函数之间的关系,从而提高其代码理解能力。\n- 阿里云OpenAPI:基于数据中阿里云OpenAPI的信息,模型可以更好的理解其相关信息以及调用方式,在开发过程中提供更合适的函数建议。\n\n如有任何问题或需要进一步帮助,请随时联系我们。感谢您对函数调用数据集及其应用的兴趣与支持!"
] |
9ce2c7778f74c37d4ff65d65b771510b0b4f9375
|
# Dataset Card for Evaluation run of _fsx_shared-falcon-180B_platypus_15_converted_safetensors
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/_fsx_shared-falcon-180B_platypus_15_converted_safetensors
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [_fsx_shared-falcon-180B_platypus_15_converted_safetensors](https://huggingface.co/_fsx_shared-falcon-180B_platypus_15_converted_safetensors) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details__fsx_shared-falcon-180B_platypus_15_converted_safetensors",
"harness_truthfulqa_mc_0",
split="train")
```
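The aggregated metrics live in the "results" configuration described above; a sketch of pulling them, where the "latest" split name follows the timestamp/"latest" pattern used by the other configs (an assumption; the card's own example uses split="train"):

```python
from datasets import load_dataset

# "results" aggregates all evaluated tasks for this run; "latest" should
# point at the most recent evaluation (assumption noted above).
results = load_dataset(
    "open-llm-leaderboard/details__fsx_shared-falcon-180B_platypus_15_converted_safetensors",
    "results",
    split="latest",
)
print(results[0])
```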
## Latest results
These are the [latest results from run 2023-09-13T03:07:15.932697](https://huggingface.co/datasets/open-llm-leaderboard/details__fsx_shared-falcon-180B_platypus_15_converted_safetensors/blob/main/results_2023-09-13T03-07-15.932697.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6795378405588016,
"acc_stderr": 0.03169754857202292,
"acc_norm": 0.6832295460165766,
"acc_norm_stderr": 0.031667751099416844,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.01711581563241818,
"mc2": 0.5565099709811991,
"mc2_stderr": 0.015263307246122862
},
"harness|arc:challenge|25": {
"acc": 0.6100682593856656,
"acc_stderr": 0.01425295984889289,
"acc_norm": 0.6569965870307167,
"acc_norm_stderr": 0.013872423223718166
},
"harness|hellaswag|10": {
"acc": 0.7210714997012547,
"acc_stderr": 0.004475557360359705,
"acc_norm": 0.8919537940649273,
"acc_norm_stderr": 0.003098043101775829
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7986111111111112,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.7986111111111112,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.031639106653672915,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.031639106653672915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4708994708994709,
"acc_stderr": 0.025707658614154957,
"acc_norm": 0.4708994708994709,
"acc_norm_stderr": 0.025707658614154957
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.02233170761182307,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.02233170761182307
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.03123475237772117,
"acc_norm": 0.8,
"acc_norm_stderr": 0.03123475237772117
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026552207828215293,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026552207828215293
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970562,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970562
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8623853211009175,
"acc_stderr": 0.014770105878649416,
"acc_norm": 0.8623853211009175,
"acc_norm_stderr": 0.014770105878649416
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931048,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931048
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017016,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017016
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383602,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383602
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8625954198473282,
"acc_stderr": 0.030194823996804475,
"acc_norm": 0.8625954198473282,
"acc_norm_stderr": 0.030194823996804475
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.0345727283691767,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.0345727283691767
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5803571428571429,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.5803571428571429,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8492975734355045,
"acc_stderr": 0.012793420883120802,
"acc_norm": 0.8492975734355045,
"acc_norm_stderr": 0.012793420883120802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5921787709497207,
"acc_stderr": 0.016435865260914746,
"acc_norm": 0.5921787709497207,
"acc_norm_stderr": 0.016435865260914746
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8117283950617284,
"acc_stderr": 0.021751866060815864,
"acc_norm": 0.8117283950617284,
"acc_norm_stderr": 0.021751866060815864
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5460992907801419,
"acc_stderr": 0.029700453247291467,
"acc_norm": 0.5460992907801419,
"acc_norm_stderr": 0.029700453247291467
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5247718383311604,
"acc_stderr": 0.012754553719781753,
"acc_norm": 0.5247718383311604,
"acc_norm_stderr": 0.012754553719781753
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103128,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103128
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.04122066502878285,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.04122066502878285
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.02448448716291397,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.02448448716291397
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.01711581563241818,
"mc2": 0.5565099709811991,
"mc2_stderr": 0.015263307246122862
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details__fsx_shared-falcon-180B_platypus_15_converted_safetensors
|
[
"region:us"
] |
2023-09-13T02:07:27+00:00
|
{"pretty_name": "Evaluation run of _fsx_shared-falcon-180B_platypus_15_converted_safetensors", "dataset_summary": "Dataset automatically created during the evaluation run of model [_fsx_shared-falcon-180B_platypus_15_converted_safetensors](https://huggingface.co/_fsx_shared-falcon-180B_platypus_15_converted_safetensors) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details__fsx_shared-falcon-180B_platypus_15_converted_safetensors\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-13T03:07:15.932697](https://huggingface.co/datasets/open-llm-leaderboard/details__fsx_shared-falcon-180B_platypus_15_converted_safetensors/blob/main/results_2023-09-13T03-07-15.932697.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6795378405588016,\n \"acc_stderr\": 0.03169754857202292,\n \"acc_norm\": 0.6832295460165766,\n \"acc_norm_stderr\": 0.031667751099416844,\n \"mc1\": 0.3953488372093023,\n \"mc1_stderr\": 0.01711581563241818,\n \"mc2\": 0.5565099709811991,\n \"mc2_stderr\": 0.015263307246122862\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6100682593856656,\n \"acc_stderr\": 0.01425295984889289,\n \"acc_norm\": 0.6569965870307167,\n \"acc_norm_stderr\": 0.013872423223718166\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7210714997012547,\n \"acc_stderr\": 0.004475557360359705,\n \"acc_norm\": 0.8919537940649273,\n \"acc_norm_stderr\": 0.003098043101775829\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.7986111111111112,\n \"acc_norm_stderr\": 
0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.031639106653672915,\n \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.031639106653672915\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4708994708994709,\n \"acc_stderr\": 0.025707658614154957,\n \"acc_norm\": 0.4708994708994709,\n \"acc_norm_stderr\": 0.025707658614154957\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.02233170761182307,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.02233170761182307\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.03123475237772117,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.03123475237772117\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026552207828215293,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026552207828215293\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 0.01599622932024412\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970562,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970562\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176892,\n \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176892\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8623853211009175,\n \"acc_stderr\": 0.014770105878649416,\n \"acc_norm\": 0.8623853211009175,\n \"acc_norm_stderr\": 0.014770105878649416\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.03362277436608044,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03362277436608044\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931048,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931048\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017016,\n \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017016\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n \"acc_stderr\": 0.027790177064383602,\n \"acc_norm\": 0.7802690582959642,\n \"acc_norm_stderr\": 0.027790177064383602\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8625954198473282,\n \"acc_stderr\": 0.030194823996804475,\n \"acc_norm\": 0.8625954198473282,\n \"acc_norm_stderr\": 0.030194823996804475\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.0345727283691767,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.0345727283691767\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5803571428571429,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.5803571428571429,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8492975734355045,\n \"acc_stderr\": 0.012793420883120802,\n \"acc_norm\": 0.8492975734355045,\n \"acc_norm_stderr\": 0.012793420883120802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5921787709497207,\n \"acc_stderr\": 0.016435865260914746,\n \"acc_norm\": 0.5921787709497207,\n \"acc_norm_stderr\": 0.016435865260914746\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8117283950617284,\n \"acc_stderr\": 0.021751866060815864,\n \"acc_norm\": 0.8117283950617284,\n \"acc_norm_stderr\": 0.021751866060815864\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291467,\n \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291467\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5247718383311604,\n \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.5247718383311604,\n \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.017630827375148383,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.017630827375148383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n \"acc_stderr\": 0.04122066502878285,\n \"acc_norm\": 0.7545454545454545,\n \"acc_norm_stderr\": 0.04122066502878285\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.02448448716291397,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.02448448716291397\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n \"mc1_stderr\": 0.01711581563241818,\n \"mc2\": 0.5565099709811991,\n \"mc2_stderr\": 0.015263307246122862\n }\n}\n```", "repo_url": "https://huggingface.co/_fsx_shared-falcon-180B_platypus_15_converted_safetensors", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|arc:challenge|25_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hellaswag|10_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-07-15.932697.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-07-15.932697.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-07-15.932697.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T03-07-15.932697.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T03-07-15.932697.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_13T03_07_15.932697", "path": ["results_2023-09-13T03-07-15.932697.parquet"]}, {"split": "latest", "path": ["results_2023-09-13T03-07-15.932697.parquet"]}]}]}
|
2023-09-13T02:07:40+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of _fsx_shared-falcon-180B_platypus_15_converted_safetensors
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model _fsx_shared-falcon-180B_platypus_15_converted_safetensors on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
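```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details__fsx_shared-falcon-180B_platypus_15_converted_safetensors",
    "harness_truthfulqa_mc_0",
    split="train")
```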
## Latest results
These are the latest results from run 2023-09-13T03:07:15.932697 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of _fsx_shared-falcon-180B_platypus_15_converted_safetensors",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model _fsx_shared-falcon-180B_platypus_15_converted_safetensors on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-13T03:07:15.932697(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of _fsx_shared-falcon-180B_platypus_15_converted_safetensors",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model _fsx_shared-falcon-180B_platypus_15_converted_safetensors on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-13T03:07:15.932697(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
36,
31,
184,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of _fsx_shared-falcon-180B_platypus_15_converted_safetensors## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model _fsx_shared-falcon-180B_platypus_15_converted_safetensors on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-13T03:07:15.932697(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
bcbaddab56488d005f3f0be138e045fa65387d49
|
# Dataset of hapu (Pokémon)
This is the dataset of hapu (Pokémon), containing 85 images and their tags.
The core tags of this character are `long_hair, black_hair, twintails, dark-skinned_female, thick_eyebrows, dark_skin, purple_eyes, bright_pupils, eyelashes, white_pupils, purple_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 85 | 68.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hapu_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 85 | 41.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hapu_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 161 | 78.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hapu_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 85 | 60.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hapu_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 161 | 109.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hapu_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hapu_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, bonnet, closed_mouth, jumpsuit, black_footwear, boots, grey_eyes, looking_at_viewer, solo, full_body, smile, belt, grey_gloves, pouch, puffy_short_sleeves, standing, blush |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bonnet | closed_mouth | jumpsuit | black_footwear | boots | grey_eyes | looking_at_viewer | solo | full_body | smile | belt | grey_gloves | pouch | puffy_short_sleeves | standing | blush |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:---------------|:-----------|:-----------------|:--------|:------------|:--------------------|:-------|:------------|:--------|:-------|:--------------|:--------|:----------------------|:-----------|:--------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/hapu_pokemon
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-13T02:22:07+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T22:43:51+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of hapu (Pokémon)
=========================
This is the dataset of hapu (Pokémon), containing 85 images and their tags.
The core tags of this character are 'long\_hair, black\_hair, twintails, dark-skinned\_female, thick\_eyebrows, dark\_skin, purple\_eyes, bright\_pupils, eyelashes, white\_pupils, purple\_headwear', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
02ae7e681071a8b759b8682c7e5e6524c74eb57b
|
# Dataset Card for "c4_academicbiomedical_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
zxvix/c4_academicbiomedical_2
|
[
"region:us"
] |
2023-09-13T02:35:41+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "timestamp", "dtype": "timestamp[s]"}, {"name": "url", "dtype": "string"}, {"name": "original_text", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 2352052.0, "num_examples": 986}], "download_size": 1376270, "dataset_size": 2352052.0}}
|
2023-09-13T02:58:39+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "c4_academicbiomedical_2"
More Information needed
|
[
"# Dataset Card for \"c4_academicbiomedical_2\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"c4_academicbiomedical_2\"\n\nMore Information needed"
] |
[
6,
21
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"c4_academicbiomedical_2\"\n\nMore Information needed"
] |
71a88b8839977af2b001ce194656cc079233d251
|
# Dataset of momi (Pokémon)
This is the dataset of momi (Pokémon), containing 105 images and their tags.
The core tags of this character are `green_hair, long_hair, braid, single_braid, green_eyes, breasts, hair_over_shoulder, hair_between_eyes, bangs, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 105 | 91.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/momi_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 105 | 57.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/momi_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 188 | 107.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/momi_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 105 | 82.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/momi_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 188 | 148.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/momi_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/momi_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
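For the non-raw IMG+TXT packages, the tags ship as plain-text caption files next to the images; a minimal sketch for reading them, assuming the usual `foo.png` / `foo.txt` pairing inside the extracted directory:
```python
import glob
import os

# walk the extracted package and pair each caption file with its image stem
for txt_path in glob.glob(os.path.join('dataset_dir', '**', '*.txt'), recursive=True):
    stem = os.path.splitext(txt_path)[0]  # assumed to match the image filename
    with open(txt_path, encoding='utf-8') as f:
        tags = f.read().strip()
    print(stem, '->', tags)
```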
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, green_jacket, green_skirt, long_sleeves, open_mouth, :d, closed_eyes, pokemon_(creature), eyelashes, blush_stickers, boots, shirt |
| 1 | 5 |  |  |  |  |  | 1girl, closed_mouth, hand_up, white_background, green_jacket, green_skirt, long_sleeves, simple_background, smile, solo, sketch, braided_ponytail, brown_footwear, dress, looking_at_viewer, shirt |
| 2 | 5 |  |  |  |  |  | 1girl, open_mouth, smile, solo, blush, closed_eyes, green_dress, skirt |
| 3 | 6 |  |  |  |  |  | 1girl, nipples, solo, nude, blush, female_pubic_hair, pussy, spread_legs, closed_eyes, navel, sweat |
| 4 | 7 |  |  |  |  |  | 1girl, hetero, sex, 1boy, blush, open_mouth, penis, vaginal, cum_in_pussy, nipples, censored, solo_focus, sweat, girl_on_top, nude, saliva, straddling, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | green_jacket | green_skirt | long_sleeves | open_mouth | :d | closed_eyes | pokemon_(creature) | eyelashes | blush_stickers | boots | shirt | closed_mouth | hand_up | white_background | simple_background | smile | solo | sketch | braided_ponytail | brown_footwear | dress | looking_at_viewer | blush | green_dress | skirt | nipples | nude | female_pubic_hair | pussy | spread_legs | navel | sweat | hetero | sex | 1boy | penis | vaginal | cum_in_pussy | censored | solo_focus | girl_on_top | saliva | straddling | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------|:---------------|:-------------|:-----|:--------------|:---------------------|:------------|:-----------------|:--------|:--------|:---------------|:----------|:-------------------|:--------------------|:--------|:-------|:---------|:-------------------|:-----------------|:--------|:--------------------|:--------|:--------------|:--------|:----------|:-------|:--------------------|:--------|:--------------|:--------|:--------|:---------|:------|:-------|:--------|:----------|:---------------|:-----------|:-------------|:--------------|:---------|:-------------|:-------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | | | X | | X | | | | | | | | | | X | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | | | | X | | | | | | | | | | | X | | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | | X | | | | | | | | | | | | | | | | | | | X | | | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/momi_pokemon
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-13T02:36:00+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T22:47:22+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of momi (Pokémon)
=========================
This is the dataset of momi (Pokémon), containing 105 images and their tags.
The core tags of this character are 'green\_hair, long\_hair, braid, single\_braid, green\_eyes, breasts, hair\_over\_shoulder, hair\_between\_eyes, bangs, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
cbad208539fff1086afd8b128cff8fe1dcbb0441
|
# Dataset of mars (Pokémon)
This is the dataset of mars (Pokémon), containing 59 images and their tags.
The core tags of this character are `red_hair, short_hair, red_eyes, breasts, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 59 | 45.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mars_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 59 | 30.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mars_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 106 | 54.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mars_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 59 | 42.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mars_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 106 | 68.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mars_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mars_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
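The per-image metadata also makes it easy to reproduce rough statistics like the cluster tables below; a minimal sketch that tallies tag frequencies, assuming `item.meta['tags']` iterates over tag names:
```python
from collections import Counter

from waifuc.source import LocalSource

# tally how often each tag appears across the whole dataset
counter = Counter()
for item in LocalSource('dataset_dir'):
    # wrap in list() so Counter counts tag names even if 'tags' is a mapping
    counter.update(list(item.meta['tags']))
print(counter.most_common(10))
```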
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, hetero, blush, 1boy, penis, cum, nipples, solo_focus, open_mouth, sex, testicles, ass, medium_breasts, pussy, torn_clothes, uncensored, vaginal |
| 1 | 6 |  |  |  |  |  | 1girl, eyelashes, looking_at_viewer, cowlick, long_sleeves, pantyhose, solo, hair_between_eyes, orange_eyes, smile, dress, hands_up, open_mouth, tongue |
| 2 | 18 |  |  |  |  |  | 1girl, solo, smile, holding_poke_ball, poke_ball_(basic), looking_at_viewer, pantyhose, dress, eyelashes, long_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hetero | blush | 1boy | penis | cum | nipples | solo_focus | open_mouth | sex | testicles | ass | medium_breasts | pussy | torn_clothes | uncensored | vaginal | eyelashes | looking_at_viewer | cowlick | long_sleeves | pantyhose | solo | hair_between_eyes | orange_eyes | smile | dress | hands_up | tongue | holding_poke_ball | poke_ball_(basic) |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:--------|:-------|:--------|:------|:----------|:-------------|:-------------|:------|:------------|:------|:-----------------|:--------|:---------------|:-------------|:----------|:------------|:--------------------|:----------|:---------------|:------------|:-------|:--------------------|:--------------|:--------|:--------|:-----------|:---------|:--------------------|:--------------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | |
| 2 | 18 |  |  |  |  |  | X | | | | | | | | | | | | | | | | | X | X | | X | X | X | | | X | X | | | X | X |
|
CyberHarem/mars_pokemon
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-13T02:46:39+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T22:41:57+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of mars (Pokémon)
=========================
This is the dataset of mars (Pokémon), containing 59 images and their tags.
The core tags of this character are 'red\_hair, short\_hair, red\_eyes, breasts, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
c54a35636bc588e6ef4148f32bd1411a12d24e26
|
# Dataset Card for Evaluation run of Danielbrdz/Barcenas-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Danielbrdz/Barcenas-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Danielbrdz/Barcenas-13b](https://huggingface.co/Danielbrdz/Barcenas-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Danielbrdz__Barcenas-13b",
"harness_winogrande_5",
split="train")
```
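The available configuration names can also be listed programmatically before picking one:
```python
from datasets import get_dataset_config_names

# list the per-task configurations plus the aggregated "results" config
configs = get_dataset_config_names("open-llm-leaderboard/details_Danielbrdz__Barcenas-13b")
print(len(configs), configs[:5])
```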
## Latest results
These are the [latest results from run 2023-10-28T10:25:35.376129](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__Barcenas-13b/blob/main/results_2023-10-28T10-25-35.376129.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004718959731543624,
"em_stderr": 0.0007018360183131106,
"f1": 0.07556837248322182,
"f1_stderr": 0.001653769461736285,
"acc": 0.44339933687296285,
"acc_stderr": 0.010506321335992158
},
"harness|drop|3": {
"em": 0.004718959731543624,
"em_stderr": 0.0007018360183131106,
"f1": 0.07556837248322182,
"f1_stderr": 0.001653769461736285
},
"harness|gsm8k|5": {
"acc": 0.12357846853677028,
"acc_stderr": 0.009065050306776923
},
"harness|winogrande|5": {
"acc": 0.7632202052091555,
"acc_stderr": 0.011947592365207394
}
}
```
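The same aggregated numbers can be read back from the "results" configuration instead of the raw JSON file; a minimal sketch (the "latest" split always points at the most recent run):
```python
from datasets import load_dataset

# load the aggregated results; "latest" tracks the most recent run
results = load_dataset(
    "open-llm-leaderboard/details_Danielbrdz__Barcenas-13b",
    "results",
    split="latest",
)
print(results[0])
```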
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Danielbrdz__Barcenas-13b
|
[
"region:us"
] |
2023-09-13T02:48:32+00:00
|
{"pretty_name": "Evaluation run of Danielbrdz/Barcenas-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Danielbrdz/Barcenas-13b](https://huggingface.co/Danielbrdz/Barcenas-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Danielbrdz__Barcenas-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T10:25:35.376129](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__Barcenas-13b/blob/main/results_2023-10-28T10-25-35.376129.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004718959731543624,\n \"em_stderr\": 0.0007018360183131106,\n \"f1\": 0.07556837248322182,\n \"f1_stderr\": 0.001653769461736285,\n \"acc\": 0.44339933687296285,\n \"acc_stderr\": 0.010506321335992158\n },\n \"harness|drop|3\": {\n \"em\": 0.004718959731543624,\n \"em_stderr\": 0.0007018360183131106,\n \"f1\": 0.07556837248322182,\n \"f1_stderr\": 0.001653769461736285\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12357846853677028,\n \"acc_stderr\": 0.009065050306776923\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.011947592365207394\n }\n}\n```", "repo_url": "https://huggingface.co/Danielbrdz/Barcenas-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|arc:challenge|25_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T10_25_35.376129", "path": ["**/details_harness|drop|3_2023-10-28T10-25-35.376129.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T10-25-35.376129.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T10_25_35.376129", "path": ["**/details_harness|gsm8k|5_2023-10-28T10-25-35.376129.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T10-25-35.376129.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hellaswag|10_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-48-16.128379.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-48-16.128379.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T03-48-16.128379.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T03-48-16.128379.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T03-48-16.128379.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T10_25_35.376129", "path": ["**/details_harness|winogrande|5_2023-10-28T10-25-35.376129.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T10-25-35.376129.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_13T03_48_16.128379", "path": ["results_2023-09-13T03-48-16.128379.parquet"]}, {"split": "2023_10_28T10_25_35.376129", "path": ["results_2023-10-28T10-25-35.376129.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T10-25-35.376129.parquet"]}]}]}
|
2023-10-28T09:25:47+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Danielbrdz/Barcenas-13b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Danielbrdz/Barcenas-13b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
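A minimal sketch of such a call (the card above omits the snippet; the repo id is an assumption, following the leaderboard's usual `details_<org>__<model>` naming, and the config name is taken from the config list in this dataset's metadata):
```python
from datasets import load_dataset

# Repo id is assumed from the leaderboard's details_<org>__<model> convention.
data = load_dataset(
    "open-llm-leaderboard/details_Danielbrdz__Barcenas-13b",
    "harness_winogrande_5",  # one of the 64 task configurations
    split="latest",          # the config metadata defines timestamped splits plus "latest"
)
print(data)
```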
## Latest results
These are the latest results from run 2023-10-28T10:25:35.376129 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Danielbrdz/Barcenas-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Danielbrdz/Barcenas-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T10:25:35.376129(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Danielbrdz/Barcenas-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Danielbrdz/Barcenas-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T10:25:35.376129(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
18,
31,
166,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Danielbrdz/Barcenas-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Danielbrdz/Barcenas-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T10:25:35.376129(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
5e690337f518ffc3d00b3351643c0fc7549be54b
|
# Dataset of dracaena/ドラセナ (Pokémon)
This is the dataset of dracaena/ドラセナ (Pokémon), containing 62 images and their tags.
The core tags of this character are `long_hair, black_hair, breasts, earrings, mature_female`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 62 | 55.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dracaena_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 62 | 35.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dracaena_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 133 | 66.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dracaena_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 62 | 49.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dracaena_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 133 | 87.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dracaena_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dracaena_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, closed_eyes, necklace, open_mouth, smile, pokemon_(creature), dress, simple_background, solo |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_eyes | necklace | open_mouth | smile | pokemon_(creature) | dress | simple_background | solo |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:-----------|:-------------|:--------|:---------------------|:--------|:--------------------|:-------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
|
CyberHarem/dracaena_pokemon
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-13T02:57:14+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T22:41:18+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of dracaena/ドラセナ (Pokémon)
==================================
This is the dataset of dracaena/ドラセナ (Pokémon), containing 62 images and their tags.
The core tags of this character are 'long\_hair, black\_hair, breasts, earrings, mature\_female', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
245c2c539aa00802a13235e75b59e0fb0d6b0399
|
# Dataset of dahlia (Pokémon)
This is the dataset of dahlia (Pokémon), containing 25 images and their tags.
The core tags of this character are `black_hair, long_hair, dark_skin, breasts, dark-skinned_female, blue_eyes, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 25 | 15.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dahlia_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 25 | 11.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dahlia_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 45 | 20.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dahlia_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 25 | 14.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dahlia_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 45 | 26.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dahlia_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dahlia_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, smile, solo, pants, midriff, navel_piercing, denim, cleavage, crop_top |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | solo | pants | midriff | navel_piercing | denim | cleavage | crop_top |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------|:----------|:-----------------|:--------|:-----------|:-----------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
|
CyberHarem/dahlia_pokemon
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-13T03:01:17+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T22:44:50+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of dahlia (Pokémon)
===========================
This is the dataset of dahlia (Pokémon), containing 25 images and their tags.
The core tags of this character are 'black\_hair, long\_hair, dark\_skin, breasts, dark-skinned\_female, blue\_eyes, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
b3f75313e7f02fbe07fed893cb8975f84282d1da
|
# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TigerResearch/tigerbot-70b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TigerResearch/tigerbot-70b-chat](https://huggingface.co/TigerResearch/tigerbot-70b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat",
"harness_winogrande_5",
split="train")
```
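Beyond a single task, the aggregated metrics can be loaded the same way; a sketch (not part of the original card), using the "results" config and the "latest" split defined in this dataset's metadata:
```python
from datasets import load_dataset

# The "results" config collects the aggregated metrics of every run;
# its "latest" split points at the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat",
    "results",
    split="latest",
)
print(results[0])
```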
## Latest results
These are the [latest results from run 2023-10-25T05:20:39.857272](https://huggingface.co/datasets/open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat/blob/main/results_2023-10-25T05-20-39.857272.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.43791946308724833,
"em_stderr": 0.005080846199755935,
"f1": 0.47991820469798696,
"f1_stderr": 0.004915876956213108,
"acc": 0.6161274146961446,
"acc_stderr": 0.012720219505629717
},
"harness|drop|3": {
"em": 0.43791946308724833,
"em_stderr": 0.005080846199755935,
"f1": 0.47991820469798696,
"f1_stderr": 0.004915876956213108
},
"harness|gsm8k|5": {
"acc": 0.4564063684609553,
"acc_stderr": 0.013720038270485325
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774106
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat
|
[
"region:us"
] |
2023-09-13T03:03:49+00:00
|
{"pretty_name": "Evaluation run of TigerResearch/tigerbot-70b-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [TigerResearch/tigerbot-70b-chat](https://huggingface.co/TigerResearch/tigerbot-70b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-25T05:20:39.857272](https://huggingface.co/datasets/open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat/blob/main/results_2023-10-25T05-20-39.857272.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.43791946308724833,\n \"em_stderr\": 0.005080846199755935,\n \"f1\": 0.47991820469798696,\n \"f1_stderr\": 0.004915876956213108,\n \"acc\": 0.6161274146961446,\n \"acc_stderr\": 0.012720219505629717\n },\n \"harness|drop|3\": {\n \"em\": 0.43791946308724833,\n \"em_stderr\": 0.005080846199755935,\n \"f1\": 0.47991820469798696,\n \"f1_stderr\": 0.004915876956213108\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4564063684609553,\n \"acc_stderr\": 0.013720038270485325\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774106\n }\n}\n```", "repo_url": "https://huggingface.co/TigerResearch/tigerbot-70b-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|arc:challenge|25_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|arc:challenge|25_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_24T02_18_07.282954", "path": ["**/details_harness|drop|3_2023-10-24T02-18-07.282954.parquet"]}, {"split": "2023_10_25T05_20_39.857272", "path": ["**/details_harness|drop|3_2023-10-25T05-20-39.857272.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T05-20-39.857272.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_24T02_18_07.282954", "path": ["**/details_harness|gsm8k|5_2023-10-24T02-18-07.282954.parquet"]}, {"split": "2023_10_25T05_20_39.857272", "path": ["**/details_harness|gsm8k|5_2023-10-25T05-20-39.857272.parquet"]}, {"split": "latest", "path": 
["**/details_harness|gsm8k|5_2023-10-25T05-20-39.857272.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hellaswag|10_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hellaswag|10_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-03-35.733983.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T04-03-35.733983.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-21-04.931146.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-21-04.931146.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-21-04.931146.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T04-21-04.931146.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": 
"2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": 
["**/details_harness|hendrycksTest-management|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": 
"2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T04-21-04.931146.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T04-21-04.931146.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_24T02_18_07.282954", "path": ["**/details_harness|winogrande|5_2023-10-24T02-18-07.282954.parquet"]}, {"split": "2023_10_25T05_20_39.857272", "path": ["**/details_harness|winogrande|5_2023-10-25T05-20-39.857272.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T05-20-39.857272.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_13T04_03_35.733983", "path": ["results_2023-09-13T04-03-35.733983.parquet"]}, {"split": "2023_09_13T04_21_04.931146", "path": ["results_2023-09-13T04-21-04.931146.parquet"]}, {"split": "2023_10_24T02_18_07.282954", "path": ["results_2023-10-24T02-18-07.282954.parquet"]}, {"split": "2023_10_25T05_20_39.857272", "path": ["results_2023-10-25T05-20-39.857272.parquet"]}, {"split": "latest", "path": ["results_2023-10-25T05-20-39.857272.parquet"]}]}]}
|
2023-10-25T04:20:52+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-chat
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TigerResearch/tigerbot-70b-chat on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
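For instance (a minimal sketch; the repo id is an assumption here, following the leaderboard's usual `details_<org>__<model>` naming, since the URL is anonymized above):

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's details_<org>__<model> naming scheme.
data = load_dataset("open-llm-leaderboard/details_TigerResearch__tigerbot-70b-chat",
	"harness_winogrande_5",  # any config listed in this card's metadata works
	split="train")
```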
## Latest results
These are the latest results from run 2023-10-25T05:20:39.857272 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-chat",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TigerResearch/tigerbot-70b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-25T05:20:39.857272(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-chat",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TigerResearch/tigerbot-70b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-25T05:20:39.857272(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TigerResearch/tigerbot-70b-chat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TigerResearch/tigerbot-70b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-25T05:20:39.857272(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
1635577e91e2cbaf913eec85fc3eee6121fba7db
|
# Dataset Card for Evaluation run of rameshm/llama-2-13b-mathgpt-v4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/rameshm/llama-2-13b-mathgpt-v4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [rameshm/llama-2-13b-mathgpt-v4](https://huggingface.co/rameshm/llama-2-13b-mathgpt-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rameshm__llama-2-13b-mathgpt-v4",
"harness_winogrande_5",
split="train")
```
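The aggregated metrics live in the "results" configuration described above; loading its "latest" split is a small variation on the same call (a sketch, using the config and split names listed in this card's metadata):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" points at the newest run.
results = load_dataset("open-llm-leaderboard/details_rameshm__llama-2-13b-mathgpt-v4",
	"results",
	split="latest")
```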
## Latest results
These are the [latest results from run 2023-10-25T05:40:14.287010](https://huggingface.co/datasets/open-llm-leaderboard/details_rameshm__llama-2-13b-mathgpt-v4/blob/main/results_2023-10-25T05-40-14.287010.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.002202181208053691,
"em_stderr": 0.0004800510816619372,
"f1": 0.06708787751677872,
"f1_stderr": 0.0015253339046219561,
"acc": 0.41923906142571715,
"acc_stderr": 0.011369111930643223
},
"harness|drop|3": {
"em": 0.002202181208053691,
"em_stderr": 0.0004800510816619372,
"f1": 0.06708787751677872,
"f1_stderr": 0.0015253339046219561
},
"harness|gsm8k|5": {
"acc": 0.1470811220621683,
"acc_stderr": 0.009756063660359863
},
"harness|winogrande|5": {
"acc": 0.691397000789266,
"acc_stderr": 0.012982160200926584
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_rameshm__llama-2-13b-mathgpt-v4
|
[
"region:us"
] |
2023-09-13T03:14:15+00:00
|
{"pretty_name": "Evaluation run of rameshm/llama-2-13b-mathgpt-v4", "dataset_summary": "Dataset automatically created during the evaluation run of model [rameshm/llama-2-13b-mathgpt-v4](https://huggingface.co/rameshm/llama-2-13b-mathgpt-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rameshm__llama-2-13b-mathgpt-v4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-25T05:40:14.287010](https://huggingface.co/datasets/open-llm-leaderboard/details_rameshm__llama-2-13b-mathgpt-v4/blob/main/results_2023-10-25T05-40-14.287010.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002202181208053691,\n \"em_stderr\": 0.0004800510816619372,\n \"f1\": 0.06708787751677872,\n \"f1_stderr\": 0.0015253339046219561,\n \"acc\": 0.41923906142571715,\n \"acc_stderr\": 0.011369111930643223\n },\n \"harness|drop|3\": {\n \"em\": 0.002202181208053691,\n \"em_stderr\": 0.0004800510816619372,\n \"f1\": 0.06708787751677872,\n \"f1_stderr\": 0.0015253339046219561\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1470811220621683,\n \"acc_stderr\": 0.009756063660359863\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.691397000789266,\n \"acc_stderr\": 0.012982160200926584\n }\n}\n```", "repo_url": "https://huggingface.co/rameshm/llama-2-13b-mathgpt-v4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|arc:challenge|25_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_25T05_40_14.287010", "path": ["**/details_harness|drop|3_2023-10-25T05-40-14.287010.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T05-40-14.287010.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_25T05_40_14.287010", "path": ["**/details_harness|gsm8k|5_2023-10-25T05-40-14.287010.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-25T05-40-14.287010.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hellaswag|10_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-13-58.726542.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-13-58.726542.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T04-13-58.726542.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T04-13-58.726542.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T04-13-58.726542.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_25T05_40_14.287010", "path": ["**/details_harness|winogrande|5_2023-10-25T05-40-14.287010.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T05-40-14.287010.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_13T04_13_58.726542", "path": ["results_2023-09-13T04-13-58.726542.parquet"]}, {"split": "2023_10_25T05_40_14.287010", "path": ["results_2023-10-25T05-40-14.287010.parquet"]}, {"split": "latest", "path": ["results_2023-10-25T05-40-14.287010.parquet"]}]}]}
|
2023-10-25T04:40:27+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of rameshm/llama-2-13b-mathgpt-v4
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model rameshm/llama-2-13b-mathgpt-v4 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
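A minimal sketch of that loading call, following the pattern used by the other evaluation-run cards in this collection. The repository id is an assumption based on the usual `open-llm-leaderboard/details_<org>__<model>` naming (the exact id is not shown in this card), and `harness_winogrande_5` is one of the configurations declared above:

```python
from datasets import load_dataset

# Repo id assumed from the usual open-llm-leaderboard naming scheme.
data = load_dataset("open-llm-leaderboard/details_rameshm__llama-2-13b-mathgpt-v4",
	"harness_winogrande_5",
	split="train")
```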
## Latest results
These are the latest results from run 2023-10-25T05:40:14.287010 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of rameshm/llama-2-13b-mathgpt-v4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model rameshm/llama-2-13b-mathgpt-v4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-25T05:40:14.287010(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of rameshm/llama-2-13b-mathgpt-v4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model rameshm/llama-2-13b-mathgpt-v4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-25T05:40:14.287010(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of rameshm/llama-2-13b-mathgpt-v4## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model rameshm/llama-2-13b-mathgpt-v4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-25T05:40:14.287010(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
f598e6ea5c871f9b557d11b07d8d4248f127ea08
|
# LPFF: Large-Pose-Flickr-Faces Dataset
**LPFF is a large-pose Flickr face dataset comprised of 19,590 high-quality real large-pose portrait images.**
> **[ICCV 2023] LPFF: A Portrait Dataset for Face Generators Across Large Poses**
>
> [Yiqian Wu](https://onethousandwu.com/), Jing Zhang, [Hongbo Fu](http://sweb.cityu.edu.hk/hongbofu/publications.html), [Xiaogang Jin*](http://www.cad.zju.edu.cn/home/jin)
[Paper](https://arxiv.org/abs/2303.14407) [Video](http://www.cad.zju.edu.cn/home/jin/iccv2023/demo.mp4) [Suppl](https://drive.google.com/file/d/1Xktg7oqMMNN9hqGYva3BBTJoux17y2SR/view?usp=sharing) [Project Page](http://www.cad.zju.edu.cn/home/jin/iccv2023/iccv2023.htm)
The creation of 2D realistic facial images and 3D face shapes using generative networks has been a hot topic in recent years. Existing face generators exhibit exceptional performance on faces in small to medium poses (with respect to frontal faces), but struggle to produce realistic results for large poses. The distorted rendering results on large poses in 3D-aware generators further show that the generated 3D face shapes are far from the distribution of 3D faces in reality. We find that the above issues are caused by the training dataset's posture imbalance.
In this paper, we present **LPFF**, a large-pose Flickr face dataset comprised of 19,590 high-quality real large-pose portrait images. We utilize our dataset to train a 2D face generator that can process large-pose face images, as well as a 3D-aware generator that can generate realistic human face geometry. To better validate our pose-conditional 3D-aware generators, we develop a new FID measure to evaluate the 3D-level performance. Through this novel FID measure and other experiments, we show that LPFF can help 2D face generators extend their latent space and better manipulate the large-pose data, and help 3D-aware face generators achieve better view consistency and more realistic 3D reconstruction results.
### Available sources
Notice: We have uploaded all the data using OneDrive, and the shared link should be updated every two months. If you find that the link is not working, please contact us to update it.
| | Description |
| ------------------------------------------------------------ | ------------------------------------------------------------ |
| [dataset](https://github.com/oneThousand1000/LPFF-dataset/tree/master/dataset_download) | Dataset download. |
| [data_processing](https://github.com/oneThousand1000/LPFF-dataset/tree/master/data_processing) | Data processing codes and data download links. Including image alignment, camera parameters extraction, and dataset rebalance. |
| [training](https://github.com/oneThousand1000/LPFF-dataset/tree/master/training) | Model training and FID computation guidance. |
| [networks](https://github.com/oneThousand1000/LPFF-dataset/tree/master/networks) | Pretrained StyleGAN2-ada and EG3D models trained on the LPFF+FFHQ dataset. |
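For convenience, here is a minimal download sketch. It assumes the LPFF files are mirrored in this Hugging Face dataset repository (`onethousand/LPFF`); if they are not, use the GitHub links in the table above.

```python
from huggingface_hub import snapshot_download

# Sketch: fetch everything stored in this dataset repo to a local folder.
# Assumes the files are mirrored here rather than only on OneDrive/GitHub.
local_dir = snapshot_download(
    repo_id="onethousand/LPFF",
    repo_type="dataset",
)
print(local_dir)
```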
### Contact
[[email protected]](mailto:[email protected]) / [[email protected]](mailto:[email protected])
### Citation
If you find this project helpful to your research, please consider citing:
```
@inproceedings{wu2023iccvlpff,
author = {Yiqian Wu and Jing Zhang and Hongbo Fu and Xiaogang Jin},
title = {LPFF: A Portrait Dataset for Face Generators Across Large Poses},
booktitle = {2023 {IEEE/CVF} International Conference on Computer Vision, {ICCV} 2023, Paris, France, October 2-6, 2023},
publisher = {{IEEE}},
year = {2023},
}
```
|
onethousand/LPFF
|
[
"license:cc-by-nc-2.0",
"arxiv:2303.14407",
"region:us"
] |
2023-09-13T03:19:43+00:00
|
{"license": "cc-by-nc-2.0"}
|
2023-11-04T08:51:51+00:00
|
[
"2303.14407"
] |
[] |
TAGS
#license-cc-by-nc-2.0 #arxiv-2303.14407 #region-us
|
LPFF: Large-Pose-Flickr-Faces Dataset
=====================================
LPFF is a large-pose Flickr face dataset comprised of 19,590 high-quality real large-pose portrait images.
>
> [ICCV 2023] LPFF: A Portrait Dataset for Face Generators Across Large Poses
>
>
> Yiqian Wu, Jing Zhang, Hongbo Fu, Xiaogang Jin\*
>
>
>
Paper Video Suppl Project Page
The creation of 2D realistic facial images and 3D face shapes using generative networks has been a hot topic in recent years. Existing face generators exhibit exceptional performance on faces in small to medium poses (with respect to frontal faces), but struggle to produce realistic results for large poses. The distorted rendering results on large poses in 3D-aware generators further show that the generated 3D face shapes are far from the distribution of 3D faces in reality. We find that the above issues are caused by the training dataset's posture imbalance.
In this paper, we present LPFF, a large-pose Flickr face dataset comprised of 19,590 high-quality real large-pose portrait images. We utilize our dataset to train a 2D face generator that can process large-pose face images, as well as a 3D-aware generator that can generate realistic human face geometry. To better validate our pose-conditional 3D-aware generators, we develop a new FID measure to evaluate the 3D-level performance. Through this novel FID measure and other experiments, we show that LPFF can help 2D face generators extend their latent space and better manipulate the large-pose data, and help 3D-aware face generators achieve better view consistency and more realistic 3D reconstruction results.
### Available sources
Notice: We have uploaded all the data using OneDrive, and the shared link should be updated every two months. If you find that the link is not working, please contact us to update it.
### Contact
onethousand@URL / onethousand1250@URL
If you find this project helpful to your research, please consider citing:
|
[
"### Available sources\n\n\nNotice: We have uploaded all the data using OneDrive, and the shared link should be updated every two months. If you find that the link is not working, please contact us to update it.",
"### Contact\n\n\nonethousand@URL / onethousand1250@URL\n\n\nIf you find this project helpful to your research, please consider citing:"
] |
[
"TAGS\n#license-cc-by-nc-2.0 #arxiv-2303.14407 #region-us \n",
"### Available sources\n\n\nNotice: We have uploaded all the data using OneDrive, and the shared link should be updated every two months. If you find that the link is not working, please contact us to update it.",
"### Contact\n\n\nonethousand@URL / onethousand1250@URL\n\n\nIf you find this project helpful to your research, please consider citing:"
] |
[
25,
45,
33
] |
[
"passage: TAGS\n#license-cc-by-nc-2.0 #arxiv-2303.14407 #region-us \n### Available sources\n\n\nNotice: We have uploaded all the data using OneDrive, and the shared link should be updated every two months. If you find that the link is not working, please contact us to update it.### Contact\n\n\nonethousand@URL / onethousand1250@URL\n\n\nIf you find this project helpful to your research, please consider citing:"
] |
3400813a511572fed9bb18addea6dbb2513dd2de
|
# Dataset of houjou_karen/北条加蓮 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of houjou_karen/北条加蓮 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `brown_hair, brown_eyes, long_hair, breasts, bangs, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 696.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houjou_karen_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 421.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houjou_karen_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1216 | 882.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houjou_karen_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 626.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houjou_karen_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1216 | 1.18 GiB | [Download](https://huggingface.co/datasets/CyberHarem/houjou_karen_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/houjou_karen_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
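For the IMG+TXT packages, a similar sketch can be used. This is a minimal example, assuming each archive unpacks to image files accompanied by same-named `.txt` files holding the comma-separated tags:

```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# Sketch for the IMG+TXT packages (here: the 800-pixel variant).
zip_file = hf_hub_download(
    repo_id='CyberHarem/houjou_karen_idolmastercinderellagirls',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# Pair each image with its tag file, assuming same-named .txt sidecars.
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in {'.png', '.jpg', '.jpeg', '.webp'}:
        tag_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(tag_path):
            with open(tag_path, 'r', encoding='utf-8') as f:
                print(name, f.read().strip())
```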
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, smile, solo, blush, looking_at_viewer, open_mouth, microphone, earrings |
| 1 | 5 |  |  |  |  |  | 1girl, open_mouth, smile, solo, blush, looking_at_viewer, pantyhose, dress, one_eye_closed, scarf |
| 2 | 10 |  |  |  |  |  | dress, 1girl, blush, solo, bare_shoulders, looking_at_viewer, open_mouth, :d, earrings, choker, elbow_gloves, hair_flower |
| 3 | 11 |  |  |  |  |  | 1girl, blush, solo, school_uniform, smile, looking_at_viewer, necklace, skirt, twintails, cardigan, bag, drill_hair, open_mouth |
| 4 | 5 |  |  |  |  |  | blush, hair_flower, looking_at_viewer, 1girl, blue_bikini, cleavage, collarbone, frilled_bikini, large_breasts, navel, open_mouth, orange_hair, solo, :d, necklace, short_hair, side-tie_bikini_bottom, yellow_eyes |
| 5 | 10 |  |  |  |  |  | 1girl, blue_sky, cleavage, collarbone, day, solo, blush, cloud, looking_at_viewer, outdoors, floral_print, navel, white_bikini, open_mouth, necklace, ocean, side-tie_bikini_bottom, :d, water, arm_up, hair_flower, orange_hair, wading |
| 6 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, navel, nipples, female_pubic_hair, solo, smile, completely_nude, pussy, simple_background, white_background, collarbone, large_breasts |
| 7 | 5 |  |  |  |  |  | 1girl, blush, cloud, looking_at_viewer, outdoors, sky, solo, straw_hat, white_dress, day, smile, bare_shoulders, ocean, open_mouth, sundress, water, collarbone, flower, sun_hat, twintails, wet_clothes, wind_lift |
| 8 | 6 |  |  |  |  |  | 1girl, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, wrist_cuffs, black_leotard, blush, cleavage, detached_collar, smile, solo, strapless_leotard, bare_shoulders, black_bowtie, ass, black_pantyhose, closed_mouth, rabbit_tail |
| 9 | 5 |  |  |  |  |  | 1boy, 1girl, blush, girl_on_top, hetero, open_mouth, sex, vaginal, cowgirl_position, looking_at_viewer, navel, penis, pov, solo_focus, sweat, female_pubic_hair, large_breasts, nipples, orange_hair, bar_censor, bikini_bottom_aside, cum_in_pussy, heart, holding_hands, interlocked_fingers, smile, spread_legs, yellow_eyes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | solo | blush | looking_at_viewer | open_mouth | microphone | earrings | pantyhose | dress | one_eye_closed | scarf | bare_shoulders | :d | choker | elbow_gloves | hair_flower | school_uniform | necklace | skirt | twintails | cardigan | bag | drill_hair | blue_bikini | cleavage | collarbone | frilled_bikini | large_breasts | navel | orange_hair | short_hair | side-tie_bikini_bottom | yellow_eyes | blue_sky | day | cloud | outdoors | floral_print | white_bikini | ocean | water | arm_up | wading | nipples | female_pubic_hair | completely_nude | pussy | simple_background | white_background | sky | straw_hat | white_dress | sundress | flower | sun_hat | wet_clothes | wind_lift | fake_animal_ears | playboy_bunny | rabbit_ears | wrist_cuffs | black_leotard | detached_collar | strapless_leotard | black_bowtie | ass | black_pantyhose | closed_mouth | rabbit_tail | 1boy | girl_on_top | hetero | sex | vaginal | cowgirl_position | penis | pov | solo_focus | sweat | bar_censor | bikini_bottom_aside | cum_in_pussy | heart | holding_hands | interlocked_fingers | spread_legs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------|:--------------------|:-------------|:-------------|:-----------|:------------|:--------|:-----------------|:--------|:-----------------|:-----|:---------|:---------------|:--------------|:-----------------|:-----------|:--------|:------------|:-----------|:------|:-------------|:--------------|:-----------|:-------------|:-----------------|:----------------|:--------|:--------------|:-------------|:-------------------------|:--------------|:-----------|:------|:--------|:-----------|:---------------|:---------------|:--------|:--------|:---------|:---------|:----------|:--------------------|:------------------|:--------|:--------------------|:-------------------|:------|:------------|:--------------|:-----------|:---------|:----------|:--------------|:------------|:-------------------|:----------------|:--------------|:--------------|:----------------|:------------------|:--------------------|:---------------|:------|:------------------|:---------------|:--------------|:-------|:--------------|:---------|:------|:----------|:-------------------|:--------|:------|:-------------|:--------|:-------------|:----------------------|:---------------|:--------|:----------------|:----------------------|:--------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | | X | X | X | X | | X | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | X | X | X | | | | | | | | X | | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 10 |  |  |  |  |  | X | | X | X | X | X | | | | | | | | X | | | X | | X | | | | | | | X | X | | | X | X | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | X | | X | X | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | X | | | | | | | | X | | | | | | X | | | | | | | | | X | X | X | | | X | X | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | X | X | X | X | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/houjou_karen_idolmastercinderellagirls
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-13T03:21:05+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T13:47:28+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of houjou\_karen/北条加蓮 (THE iDOLM@STER: Cinderella Girls)
================================================================
This is the dataset of houjou\_karen/北条加蓮 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are 'brown\_hair, brown\_eyes, long\_hair, breasts, bangs, medium\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
2a7d9138c9999757e89e63512a3cf1808e1d507f
|
# Chinese Speech Dataset
- 刘海柱
- 林黛玉
- 甜小喵
- 蔡徐坤
- 郭德纲
|
hello2mao/Chinese_Audio_Resource
|
[
"task_categories:text-to-speech",
"task_categories:audio-classification",
"task_categories:audio-to-audio",
"language:zh",
"license:openrail",
"region:us"
] |
2023-09-13T03:36:38+00:00
|
{"language": ["zh"], "license": "openrail", "task_categories": ["text-to-speech", "audio-classification", "audio-to-audio"]}
|
2023-09-13T04:21:30+00:00
|
[] |
[
"zh"
] |
TAGS
#task_categories-text-to-speech #task_categories-audio-classification #task_categories-audio-to-audio #language-Chinese #license-openrail #region-us
|
# Chinese Speech Dataset
- 刘海柱
- 林黛玉
- 甜小喵
- 蔡徐坤
- 郭德纲
|
[
"# 中文语音数据集\n\n- 刘海柱\n- 林黛玉\n- 甜小喵\n- 蔡徐坤\n- 郭德纲"
] |
[
"TAGS\n#task_categories-text-to-speech #task_categories-audio-classification #task_categories-audio-to-audio #language-Chinese #license-openrail #region-us \n",
"# 中文语音数据集\n\n- 刘海柱\n- 林黛玉\n- 甜小喵\n- 蔡徐坤\n- 郭德纲"
] |
[
56,
31
] |
[
"passage: TAGS\n#task_categories-text-to-speech #task_categories-audio-classification #task_categories-audio-to-audio #language-Chinese #license-openrail #region-us \n# 中文语音数据集\n\n- 刘海柱\n- 林黛玉\n- 甜小喵\n- 蔡徐坤\n- 郭德纲"
] |
bd063f43999b17462cd702af76108c2669b4b593
|
# Dataset Card for "autotree_pmlb_100000_letter_sgosdt_l256_dim10_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
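A minimal loading sketch based on the features declared in this repo's dataset_info (`id`, `input_x`, `input_y`, `input_y_clean`, `rtg`, `status`, `split_threshold`, `split_dimension`); `streaming=True` is used here only to avoid downloading the full ~7 GB train split:

```python
from datasets import load_dataset

# Stream one example from the train split and inspect its fields.
ds = load_dataset(
    "yzhuang/autotree_pmlb_100000_letter_sgosdt_l256_dim10_d3_sd0",
    split="train",
    streaming=True,
)
example = next(iter(ds))
print(sorted(example.keys()))
```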
|
yzhuang/autotree_pmlb_100000_letter_sgosdt_l256_dim10_d3_sd0
|
[
"region:us"
] |
2023-09-13T03:42:10+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "input_x", "sequence": {"sequence": "float32"}}, {"name": "input_y", "sequence": {"sequence": "float32"}}, {"name": "input_y_clean", "sequence": {"sequence": "float32"}}, {"name": "rtg", "sequence": "float64"}, {"name": "status", "sequence": {"sequence": "float32"}}, {"name": "split_threshold", "sequence": {"sequence": "float32"}}, {"name": "split_dimension", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 7061642624, "num_examples": 100000}, {"name": "validation", "num_bytes": 709192128, "num_examples": 10000}], "download_size": 300590767, "dataset_size": 7770834752}}
|
2023-09-13T03:43:05+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "autotree_pmlb_100000_letter_sgosdt_l256_dim10_d3_sd0"
More Information needed
|
[
"# Dataset Card for \"autotree_pmlb_100000_letter_sgosdt_l256_dim10_d3_sd0\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"autotree_pmlb_100000_letter_sgosdt_l256_dim10_d3_sd0\"\n\nMore Information needed"
] |
[
6,
36
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"autotree_pmlb_100000_letter_sgosdt_l256_dim10_d3_sd0\"\n\nMore Information needed"
] |
fd7c93e94f568f8b41ccf980df2bc4fa8f0a4eaa
|
# Dataset Card for Evaluation run of vihangd/smartyplats-3b-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/vihangd/smartyplats-3b-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [vihangd/smartyplats-3b-v1](https://huggingface.co/vihangd/smartyplats-3b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load the per-sample details for the winogrande 5-shot task;
# swap the config name to inspect any other task listed in this repo.
data = load_dataset("open-llm-leaderboard/details_vihangd__smartyplats-3b-v1",
	"harness_winogrande_5",
	split="train")
```
## Latest results
These are the [latest results from run 2023-10-23T05:25:12.646031](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__smartyplats-3b-v1/blob/main/results_2023-10-23T05-25-12.646031.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0017827181208053692,
"em_stderr": 0.0004320097346039121,
"f1": 0.054003775167785366,
"f1_stderr": 0.0013390559797939118,
"acc": 0.33403633256401344,
"acc_stderr": 0.008080098450731814
},
"harness|drop|3": {
"em": 0.0017827181208053692,
"em_stderr": 0.0004320097346039121,
"f1": 0.054003775167785366,
"f1_stderr": 0.0013390559797939118
},
"harness|gsm8k|5": {
"acc": 0.01061410159211524,
"acc_stderr": 0.002822713322387704
},
"harness|winogrande|5": {
"acc": 0.6574585635359116,
"acc_stderr": 0.013337483579075925
}
}
```
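To fetch these aggregated numbers programmatically, the `results` configuration can be loaded as well. This is a sketch that assumes the `results` config exposes a `latest` split in the same way as the per-task configs:

```python
from datasets import load_dataset

# Assumption: the "results" config has a "latest" split like the per-task configs.
results = load_dataset("open-llm-leaderboard/details_vihangd__smartyplats-3b-v1",
	"results",
	split="latest")
print(results)
```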
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_vihangd__smartyplats-3b-v1
|
[
"region:us"
] |
2023-09-13T03:45:59+00:00
|
{"pretty_name": "Evaluation run of vihangd/smartyplats-3b-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [vihangd/smartyplats-3b-v1](https://huggingface.co/vihangd/smartyplats-3b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vihangd__smartyplats-3b-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T05:25:12.646031](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__smartyplats-3b-v1/blob/main/results_2023-10-23T05-25-12.646031.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.0004320097346039121,\n \"f1\": 0.054003775167785366,\n \"f1_stderr\": 0.0013390559797939118,\n \"acc\": 0.33403633256401344,\n \"acc_stderr\": 0.008080098450731814\n },\n \"harness|drop|3\": {\n \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.0004320097346039121,\n \"f1\": 0.054003775167785366,\n \"f1_stderr\": 0.0013390559797939118\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01061410159211524,\n \"acc_stderr\": 0.002822713322387704\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6574585635359116,\n \"acc_stderr\": 0.013337483579075925\n }\n}\n```", "repo_url": "https://huggingface.co/vihangd/smartyplats-3b-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|arc:challenge|25_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T05_25_12.646031", "path": ["**/details_harness|drop|3_2023-10-23T05-25-12.646031.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T05-25-12.646031.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T05_25_12.646031", "path": ["**/details_harness|gsm8k|5_2023-10-23T05-25-12.646031.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T05-25-12.646031.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hellaswag|10_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-45-46.348158.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-45-46.348158.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T04-45-46.348158.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T04-45-46.348158.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T04-45-46.348158.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T05_25_12.646031", "path": ["**/details_harness|winogrande|5_2023-10-23T05-25-12.646031.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T05-25-12.646031.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_13T04_45_46.348158", "path": ["results_2023-09-13T04-45-46.348158.parquet"]}, {"split": "2023_10_23T05_25_12.646031", "path": ["results_2023-10-23T05-25-12.646031.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T05-25-12.646031.parquet"]}]}]}
|
2023-10-23T04:25:24+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of vihangd/smartyplats-3b-v1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model vihangd/smartyplats-3b-v1 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
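A minimal sketch, assuming the dataset id follows the usual `open-llm-leaderboard/details_<org>__<model>` naming convention (the actual URL is elided above); `harness_winogrande_5` is one of the configurations listed in this card's metadata:

```python
from datasets import load_dataset

# hypothetical repo id, inferred from the details_<org>__<model> convention
data = load_dataset("open-llm-leaderboard/details_vihangd__smartyplats-3b-v1",
	"harness_winogrande_5",
	split="train")
```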
## Latest results
These are the latest results from run 2023-10-23T05:25:12.646031 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
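The aggregated numbers live in the `results` configuration; a similar sketch, with the same assumed repo id, reads its `latest` split (which resolves to the 2023-10-23T05:25:12.646031 run):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" points to the newest run
results = load_dataset("open-llm-leaderboard/details_vihangd__smartyplats-3b-v1",
	"results",
	split="latest")
```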
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of vihangd/smartyplats-3b-v1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model vihangd/smartyplats-3b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T05:25:12.646031(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vihangd/smartyplats-3b-v1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model vihangd/smartyplats-3b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T05:25:12.646031(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of vihangd/smartyplats-3b-v1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model vihangd/smartyplats-3b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T05:25:12.646031(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
708fb2a5ef8c4d0b4cd4f1a2022a3e5a8722aa1c
|
# Dataset Card for Evaluation run of DevaMalla/llama_7b_lora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DevaMalla/llama_7b_lora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [DevaMalla/llama_7b_lora](https://huggingface.co/DevaMalla/llama_7b_lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# each evaluated task has its own configuration; the "train" split
# always points to the details of the latest run
data = load_dataset("open-llm-leaderboard/details_DevaMalla__llama_7b_lora",
	"harness_winogrande_5",
	split="train")
```
## Latest results
These are the [latest results from run 2023-10-26T14:26:37.860045](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_lora/blob/main/results_2023-10-26T14-26-37.860045.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893118953,
"f1": 0.0611650587248323,
"f1_stderr": 0.0013990352489173911,
"acc": 0.3915240971461363,
"acc_stderr": 0.00940445989381676
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893118953,
"f1": 0.0611650587248323,
"f1_stderr": 0.0013990352489173911
},
"harness|gsm8k|5": {
"acc": 0.05534495830174375,
"acc_stderr": 0.0062982217961795855
},
"harness|winogrande|5": {
"acc": 0.7277032359905288,
"acc_stderr": 0.012510697991453934
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_DevaMalla__llama_7b_lora
|
[
"region:us"
] |
2023-09-13T04:07:51+00:00
|
{"pretty_name": "Evaluation run of DevaMalla/llama_7b_lora", "dataset_summary": "Dataset automatically created during the evaluation run of model [DevaMalla/llama_7b_lora](https://huggingface.co/DevaMalla/llama_7b_lora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DevaMalla__llama_7b_lora\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-26T14:26:37.860045](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_lora/blob/main/results_2023-10-26T14-26-37.860045.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893118953,\n \"f1\": 0.0611650587248323,\n \"f1_stderr\": 0.0013990352489173911,\n \"acc\": 0.3915240971461363,\n \"acc_stderr\": 0.00940445989381676\n },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893118953,\n \"f1\": 0.0611650587248323,\n \"f1_stderr\": 0.0013990352489173911\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05534495830174375,\n \"acc_stderr\": 0.0062982217961795855\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453934\n }\n}\n```", "repo_url": "https://huggingface.co/DevaMalla/llama_7b_lora", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|arc:challenge|25_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_26T14_26_37.860045", "path": ["**/details_harness|drop|3_2023-10-26T14-26-37.860045.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-26T14-26-37.860045.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_26T14_26_37.860045", "path": ["**/details_harness|gsm8k|5_2023-10-26T14-26-37.860045.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-26T14-26-37.860045.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hellaswag|10_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T05-07-37.970407.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T05-07-37.970407.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T05-07-37.970407.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T05-07-37.970407.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T05-07-37.970407.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_26T14_26_37.860045", "path": ["**/details_harness|winogrande|5_2023-10-26T14-26-37.860045.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-26T14-26-37.860045.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_13T05_07_37.970407", "path": ["results_2023-09-13T05-07-37.970407.parquet"]}, {"split": "2023_10_26T14_26_37.860045", "path": ["results_2023-10-26T14-26-37.860045.parquet"]}, {"split": "latest", "path": ["results_2023-10-26T14-26-37.860045.parquet"]}]}]}
|
2023-10-26T13:26:52+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of DevaMalla/llama_7b_lora
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model DevaMalla/llama_7b_lora on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
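The full card above gives the canonical loader using the `train` split; as a variant, a sketch reading a timestamped configuration through its `latest` split (`harness_gsm8k_5` is listed in this card's metadata):

```python
from datasets import load_dataset

# "latest" resolves to the most recent timestamped run for this task
gsm8k_details = load_dataset("open-llm-leaderboard/details_DevaMalla__llama_7b_lora",
	"harness_gsm8k_5",
	split="latest")
```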
## Latest results
These are the latest results from run 2023-10-26T14:26:37.860045 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of DevaMalla/llama_7b_lora",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model DevaMalla/llama_7b_lora on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-26T14:26:37.860045(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of DevaMalla/llama_7b_lora",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model DevaMalla/llama_7b_lora on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-26T14:26:37.860045(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of DevaMalla/llama_7b_lora## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model DevaMalla/llama_7b_lora on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-26T14:26:37.860045(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
33bafcdc137321cb3d9cc6c2500462a12ddd9758
|
# Dataset Card for "evol-instruct_dolly2.0_h2oGPT-falcon-40B-oasst1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
sachith-surge/evol-instruct_dolly2.0_h2oGPT-falcon-40B-oasst1
|
[
"region:us"
] |
2023-09-13T04:43:22+00:00
|
{"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "evolution_strategy", "dtype": "string"}, {"name": "in-depth-evolving_operation", "dtype": "string"}, {"name": "epoch", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 3051568, "num_examples": 2304}], "download_size": 1665250, "dataset_size": 3051568}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-13T04:43:25+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "evol-instruct_dolly2.0_h2oGPT-falcon-40B-oasst1"
More Information needed
|
[
"# Dataset Card for \"evol-instruct_dolly2.0_h2oGPT-falcon-40B-oasst1\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"evol-instruct_dolly2.0_h2oGPT-falcon-40B-oasst1\"\n\nMore Information needed"
] |
[
6,
35
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"evol-instruct_dolly2.0_h2oGPT-falcon-40B-oasst1\"\n\nMore Information needed"
] |
d4e2bf13f39365d38b8ec432c283f66fa1c3cb0a
|
# Dataset Card for "sales4-formatted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
pssubitha/sales4-formatted
|
[
"region:us"
] |
2023-09-13T05:06:01+00:00
|
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 46461, "num_examples": 120}], "download_size": 24850, "dataset_size": 46461}}
|
2023-09-13T08:20:26+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "sales4-formatted"
More Information needed
|
[
"# Dataset Card for \"sales4-formatted\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"sales4-formatted\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"sales4-formatted\"\n\nMore Information needed"
] |
798a4ab424fc623da4b3c3ebb3e11c2ab53018fe
|
### OpenAI Fine-Tuning Medical Dataset
Open-source datasets tend to overlap heavily with the data used to pre-train base models, so fine-tuning on them yields inconspicuous results with no standout effect; this makes it necessary to build a dedicated dataset with a distinctive style.
For this reason, a medical dataset in the **Jin Yong writing style** was built, which can be used directly for OpenAI fine-tuning.
Fine-tuning experiments confirm that, when given a medical question, the fine-tuned model answers in the **Jin Yong writing style**.
### Question
```
[{'role': 'system', 'content': '你是我的私人医生助手,你要回答我的健康问题。'}, {'role': 'user', 'content': '我睡眠不足,很头痛'}]
```
### Answer
```
少侠保重身体。江湖人称“睡眠不足,头痛难受”,此乃常见之症。吾乃医术高强之侠士,愿为你指点迷津。睡眠不足,多因忧虑纷扰、劳累过度所致。当以调整生活作息,保持规律之睡眠时间为要。且可饮用一些花草茶,如菊花、薰衣草,以宁心安神。如症状持续,还需请教名医,以寻得更为妥善之疗法。
```
As can be seen, the model's answers have indeed acquired the **Jin Yong writing style**.
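As a sketch (the file name and the abbreviated assistant reply are illustrative), each Question/Answer pair above can be packaged into the JSONL chat format that OpenAI fine-tuning expects:

```python
import json

# One training example in OpenAI's chat fine-tuning JSONL format, mirroring
# the Question/Answer pair shown above (assistant reply abbreviated here).
example = {
    "messages": [
        {"role": "system", "content": "你是我的私人医生助手,你要回答我的健康问题。"},
        {"role": "user", "content": "我睡眠不足,很头痛"},
        {"role": "assistant", "content": "少侠保重身体。……"},
    ]
}

with open("train.jsonl", "a", encoding="utf-8") as f:
    f.write(json.dumps(example, ensure_ascii=False) + "\n")
```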
|
conghao/gpt3.5-jinyong-style
|
[
"task_categories:question-answering",
"size_categories:1K<n<10K",
"language:zh",
"medical",
"region:us"
] |
2023-09-13T05:13:59+00:00
|
{"language": ["zh"], "size_categories": ["1K<n<10K"], "task_categories": ["question-answering"], "tags": ["medical"]}
|
2023-09-15T02:17:19+00:00
|
[] |
[
"zh"
] |
TAGS
#task_categories-question-answering #size_categories-1K<n<10K #language-Chinese #medical #region-us
|
### OpenAI Fine-Tuning Medical Dataset
Open-source datasets tend to overlap heavily with the data used to pre-train base models, so fine-tuning on them yields inconspicuous results with no standout effect; this makes it necessary to build a dedicated dataset with a distinctive style.
For this reason, a medical dataset in the Jin Yong writing style was built, which can be used directly for OpenAI fine-tuning.
Fine-tuning experiments confirm that, when given a medical question, the fine-tuned model answers in the Jin Yong writing style.
### Question
### Answer
As can be seen, the model's answers have indeed acquired the Jin Yong writing style.
|
[
"### OpenAI微调医学数据集\n\n开源数据集存在与预训练模型数据集相似度较高的问题,导致微调结果不明显,效果不突出,因此就需要构建某种风格的专属数据集。\n因此,构建了带金庸创作风格的医学数据集,可直接用于openai的微调使用。\n微调实验结果证明,当输入医学问题时,微调模型回答的风格为金庸创作风格回答。",
"### Question",
"### Answer\n\n\n\n可以发现,模型的回答已经学习到金庸创作风格的能力。"
] |
[
"TAGS\n#task_categories-question-answering #size_categories-1K<n<10K #language-Chinese #medical #region-us \n",
"### OpenAI微调医学数据集\n\n开源数据集存在与预训练模型数据集相似度较高的问题,导致微调结果不明显,效果不突出,因此就需要构建某种风格的专属数据集。\n因此,构建了带金庸创作风格的医学数据集,可直接用于openai的微调使用。\n微调实验结果证明,当输入医学问题时,微调模型回答的风格为金庸创作风格回答。",
"### Question",
"### Answer\n\n\n\n可以发现,模型的回答已经学习到金庸创作风格的能力。"
] |
[
38,
99,
3,
19
] |
[
"passage: TAGS\n#task_categories-question-answering #size_categories-1K<n<10K #language-Chinese #medical #region-us \n### OpenAI微调医学数据集\n\n开源数据集存在与预训练模型数据集相似度较高的问题,导致微调结果不明显,效果不突出,因此就需要构建某种风格的专属数据集。\n因此,构建了带金庸创作风格的医学数据集,可直接用于openai的微调使用。\n微调实验结果证明,当输入医学问题时,微调模型回答的风格为金庸创作风格回答。### Question### Answer\n\n\n\n可以发现,模型的回答已经学习到金庸创作风格的能力。"
] |
8c2a95860d7c65ca2c531a189eac8ad57ddc5594
|
# Dataset Card for "filtered-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
pharaouk/filtered-1
|
[
"region:us"
] |
2023-09-13T05:16:08+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "conversation_id", "dtype": "int64"}, {"name": "dataset_id", "dtype": "string"}, {"name": "unique_conversation_id", "dtype": "string"}, {"name": "embedding", "sequence": "float32"}, {"name": "inst_prob", "dtype": "float64"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 14851302435, "num_examples": 2506367}], "download_size": 8200041049, "dataset_size": 14851302435}}
|
2023-09-13T07:35:32+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "filtered-1"
More Information needed
|
[
"# Dataset Card for \"filtered-1\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"filtered-1\"\n\nMore Information needed"
] |
[
6,
13
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"filtered-1\"\n\nMore Information needed"
] |
f90bfb8619a90fdfc943715909429ffc25322814
|
# Dataset of shiomi_shuuko/塩見周子/시오미슈코 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of shiomi_shuuko/塩見周子/시오미슈코 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `short_hair, grey_hair, hair_between_eyes, breasts, black_eyes, earrings, bangs, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 680.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiomi_shuuko_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 384.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiomi_shuuko_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1191 | 808.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiomi_shuuko_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 596.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiomi_shuuko_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1191 | 1.14 GiB | [Download](https://huggingface.co/datasets/CyberHarem/shiomi_shuuko_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shiomi_shuuko_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, looking_at_viewer, solo, collarbone, smile, upper_body, blush, necklace, simple_background, cleavage, white_background, bare_shoulders, off_shoulder, closed_mouth, dress, white_shirt |
| 1 | 14 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, cleavage, collarbone, navel, smile, black_bikini, jewelry, simple_background, sitting, white_background, large_breasts, o-ring |
| 2 | 8 |  |  |  |  |  | 1girl, blush, looking_at_viewer, side-tie_bikini_bottom, simple_background, solo, white_background, black_bikini, navel, smile, cleavage, micro_bikini, bare_shoulders, black_gloves, black_thighhighs, collarbone, elbow_gloves, black_choker, jewelry, large_breasts, open_mouth, thighs |
| 3 | 8 |  |  |  |  |  | 1girl, cleavage, collarbone, denim_shorts, hair_flower, looking_at_viewer, navel, solo, striped_bikini, short_shorts, smile, belt, blush, cutoffs, necklace, bare_shoulders, choker, layered_bikini, outdoors, bikini_top_only, blue_sky, brown_eyes, cloud, cowboy_shot, day, front-tie_bikini_top, ocean, crown_braid, flower_bracelet, leaning_forward, standing, water, yellow_flower |
| 4 | 5 |  |  |  |  |  | 1girl, belt, cleavage, collarbone, cowboy_shot, cutoffs, denim_shorts, front-tie_bikini_top, hair_flower, looking_at_viewer, navel, short_shorts, solo, striped_bikini, bare_shoulders, bikini_top_only, blush, choker, highleg_bikini, layered_bikini, necklace, white_background, :d, blonde_hair, crown_braid, o-ring, open_mouth, simple_background, arm_up, armpits, arms_behind_back, bikini_under_clothes, bracelet, chain, upper_teeth_only |
| 5 | 6 |  |  |  |  |  | 1girl, hair_ornament, looking_at_viewer, solo, blue_dress, flower, open_mouth, bracelet, sleeveless_dress, :d, blonde_hair, blush, butterfly, high_heels, night_sky |
| 6 | 13 |  |  |  |  |  | looking_at_viewer, open_cardigan, 1girl, bare_shoulders, blush, camisole, cleavage, smile, solo, collarbone, off_shoulder, short_shorts, denim_shorts, necklace, simple_background, long_sleeves, white_background, midriff, lying, navel, parted_lips |
| 7 | 13 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, detached_sleeves, hair_flower, jewelry, obi, folding_fan, wide_sleeves, bare_shoulders, blush, brown_eyes, floral_print, cleavage, holding_fan, long_sleeves, petals, cherry_blossoms, short_kimono, white_background |
| 8 | 9 |  |  |  |  |  | 1girl, fox_ears, fox_tail, kimono, looking_at_viewer, solo, wide_sleeves, blush, extra_ears, fox_mask, hair_flower, jewelry, smile, bare_shoulders, cleavage, fox_shadow_puppet, jingle_bell, detached_sleeves, obi, tabi |
| 9 | 6 |  |  |  |  |  | 1girl, detached_sleeves, extra_ears, fox_ears, fox_tail, solo, fox_mask, fox_shadow_puppet, looking_at_viewer, blush, jewelry, smile, blonde_hair, japanese_clothes, nail_polish, navel |
| 10 | 9 |  |  |  |  |  | 1girl, smile, solo, fingerless_gloves, looking_at_viewer, bracelet, garter_straps, mini_hat, short_sleeves, thighhighs, black_gloves, folding_fan, holding_fan, skirt, vertical_stripes |
| 11 | 5 |  |  |  |  |  | 2girls, blush, solo_focus, smile, blonde_hair, collarbone, large_breasts, looking_at_viewer, nipples, completely_nude, jewelry, lying, navel |
| 12 | 11 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, solo_focus, penis, sex, large_breasts, looking_at_viewer, spread_legs, vaginal, missionary, navel, on_back, open_mouth, sweat, pillow, pov, pussy, collarbone, completely_nude, cum, jewelry, male_pubic_hair, mosaic_censoring, on_bed |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | collarbone | smile | upper_body | blush | necklace | simple_background | cleavage | white_background | bare_shoulders | off_shoulder | closed_mouth | dress | white_shirt | navel | black_bikini | jewelry | sitting | large_breasts | o-ring | side-tie_bikini_bottom | micro_bikini | black_gloves | black_thighhighs | elbow_gloves | black_choker | open_mouth | thighs | denim_shorts | hair_flower | striped_bikini | short_shorts | belt | cutoffs | choker | layered_bikini | outdoors | bikini_top_only | blue_sky | brown_eyes | cloud | cowboy_shot | day | front-tie_bikini_top | ocean | crown_braid | flower_bracelet | leaning_forward | standing | water | yellow_flower | highleg_bikini | :d | blonde_hair | arm_up | armpits | arms_behind_back | bikini_under_clothes | bracelet | chain | upper_teeth_only | hair_ornament | blue_dress | flower | sleeveless_dress | butterfly | high_heels | night_sky | open_cardigan | camisole | long_sleeves | midriff | lying | parted_lips | detached_sleeves | obi | folding_fan | wide_sleeves | floral_print | holding_fan | petals | cherry_blossoms | short_kimono | fox_ears | fox_tail | kimono | extra_ears | fox_mask | fox_shadow_puppet | jingle_bell | tabi | japanese_clothes | nail_polish | fingerless_gloves | garter_straps | mini_hat | short_sleeves | thighhighs | skirt | vertical_stripes | 2girls | solo_focus | nipples | completely_nude | 1boy | hetero | penis | sex | spread_legs | vaginal | missionary | on_back | sweat | pillow | pov | pussy | cum | male_pubic_hair | mosaic_censoring | on_bed |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------------|:-------|:-------------|:--------|:-------------|:--------|:-----------|:--------------------|:-----------|:-------------------|:-----------------|:---------------|:---------------|:--------|:--------------|:--------|:---------------|:----------|:----------|:----------------|:---------|:-------------------------|:---------------|:---------------|:-------------------|:---------------|:---------------|:-------------|:---------|:---------------|:--------------|:-----------------|:---------------|:-------|:----------|:---------|:-----------------|:-----------|:------------------|:-----------|:-------------|:--------|:--------------|:------|:-----------------------|:--------|:--------------|:------------------|:------------------|:-----------|:--------|:----------------|:-----------------|:-----|:--------------|:---------|:----------|:-------------------|:-----------------------|:-----------|:--------|:-------------------|:----------------|:-------------|:---------|:-------------------|:------------|:-------------|:------------|:----------------|:-----------|:---------------|:----------|:--------|:--------------|:-------------------|:------|:--------------|:---------------|:---------------|:--------------|:---------|:------------------|:---------------|:-----------|:-----------|:---------|:-------------|:-----------|:--------------------|:--------------|:-------|:-------------------|:--------------|:--------------------|:----------------|:-----------|:----------------|:-------------|:--------|:-------------------|:---------|:-------------|:----------|:------------------|:-------|:---------|:--------|:------|:--------------|:----------|:-------------|:----------|:--------|:---------|:------|:--------|:------|:------------------|:-------------------|:---------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | X | X | X | | X | | X | X | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | X | | X | | X | X | X | X | | | | | X | X | X | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | X | X | X | | X | X | | X | | X | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | | | X | X | X | X | X | X | | | | | X | | | | | X | | | | | | | X | | X | X | X | X | X | X | X | X | | X | | | | X | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 13 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | X | X | X | | | | X | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 13 |  |  |  |  |  | X | X | X | | X | | X | | | X | X | X | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | X | X | X | | X | | X | | | X | | X | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 6 |  |  |  |  |  | X | X | X | | X | | X | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | | X | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 9 |  |  |  |  |  | X | X | X | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 11 | 5 |  |  |  |  |  | | X | | X | X | | X | | | | | | | | | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | |
| 12 | 11 |  |  |  |  |  | X | X | | X | | | X | | | | | | | | | | X | | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/shiomi_shuuko_idolmastercinderellagirls
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-13T05:38:58+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T13:31:26+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of shiomi\_shuuko/塩見周子/시오미슈코 (THE iDOLM@STER: Cinderella Girls)
=======================================================================
This is the dataset of shiomi\_shuuko/塩見周子/시오미슈코 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are 'short\_hair, grey\_hair, hair\_between\_eyes, breasts, black\_eyes, earrings, bangs, medium\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
4cecdd276e8cc39704efa09cc6b17f37a8d945b1
|
[Ping! 2](https://www.youtube.com/watch?v=8CYy9jNmpXM)
|
KaraKaraWitch/Discordian
|
[
"region:us"
] |
2023-09-13T05:41:31+00:00
|
{}
|
2023-09-13T06:28:20+00:00
|
[] |
[] |
TAGS
#region-us
|
Ping! 2
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
de2c50501b7f37cb2318a9c76e46f759aeecb37b
|
Processed [MIT-Adobe5k](https://data.csail.mit.edu/graphics/fivek/) datasets for RSFNet
Paper: https://arxiv.org/abs/2303.08682
Code: https://github.com/Vicky0522/RSFNet
If our work is helpful for your research, please consider citing:
```
@article{oywq2023rsfnet,
title={RSFNet: A White-Box Image Retouching Approach using Region-Specific Color Filters},
author={Wenqi Ouyang and Yi Dong and Xiaoyang Kang and Peiran Ren and Xin Xu and Xuansong Xie},
journal={https://arxiv.org/abs/2303.08682},
year={2023}
}
```
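A minimal download sketch using `huggingface_hub` (the internal file layout of the repo is not documented here):

```python
from huggingface_hub import snapshot_download

# Fetch the whole dataset repo and return the local cache directory.
local_dir = snapshot_download(
    repo_id="Vicky0522/MIT-Adobe5k-for-RSFNet",
    repo_type="dataset",
)
print(local_dir)
```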
|
Vicky0522/MIT-Adobe5k-for-RSFNet
|
[
"arxiv:2303.08682",
"region:us"
] |
2023-09-13T05:49:57+00:00
|
{}
|
2023-09-15T03:30:08+00:00
|
[
"2303.08682"
] |
[] |
TAGS
#arxiv-2303.08682 #region-us
|
Processed MIT-Adobe5k datasets for RSFNet
Paper: URL
Code: URL
If our work is helpful for your research, please consider citing:
|
[] |
[
"TAGS\n#arxiv-2303.08682 #region-us \n"
] |
[
14
] |
[
"passage: TAGS\n#arxiv-2303.08682 #region-us \n"
] |
e177d49989cb6ade63c186b829c3e197ea474e76
|
# Dataset Card for "falcon-refinedweb-labeled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
andersonbcdefg/falcon-refinedweb-labeled
|
[
"region:us"
] |
2023-09-13T05:55:25+00:00
|
{"dataset_info": {"features": [{"name": "content", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "dump", "dtype": "string"}, {"name": "class", "dtype": "string"}, {"name": "score", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 289378063, "num_examples": 160956}], "download_size": 184313738, "dataset_size": 289378063}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-13T05:55:40+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "falcon-refinedweb-labeled"
More Information needed
|
[
"# Dataset Card for \"falcon-refinedweb-labeled\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"falcon-refinedweb-labeled\"\n\nMore Information needed"
] |
[
6,
20
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"falcon-refinedweb-labeled\"\n\nMore Information needed"
] |
8304e6bba176ec75b8c0fece3356c67f831afb2b
|
# Dataset Card for "keyframes_d_d_gripper"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
ShuaKang/keyframes_d_d_gripper
|
[
"region:us"
] |
2023-09-13T06:00:48+00:00
|
{"dataset_info": {"features": [{"name": "keyframes_image", "dtype": "image"}, {"name": "text", "dtype": "string"}, {"name": "gripper_image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 711583897.5, "num_examples": 14638}], "download_size": 700376995, "dataset_size": 711583897.5}}
|
2023-09-13T06:13:18+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "keyframes_d_d_gripper"
More Information needed
|
[
"# Dataset Card for \"keyframes_d_d_gripper\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"keyframes_d_d_gripper\"\n\nMore Information needed"
] |
[
6,
20
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"keyframes_d_d_gripper\"\n\nMore Information needed"
] |
d112be882e18013911aacfb57724c5cace11dfca
|
# Dataset Card for "20_newsgroups"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
rasgaard/20_newsgroups
|
[
"region:us"
] |
2023-09-13T06:23:58+00:00
|
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "int64"}, {"name": "label_text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 12724811.858405516, "num_examples": 10182}, {"name": "val", "num_bytes": 1414701.1415944847, "num_examples": 1132}, {"name": "test", "num_bytes": 8499585, "num_examples": 7532}], "download_size": 0, "dataset_size": 22639098.0}}
|
2023-09-13T06:25:05+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "20_newsgroups"
More Information needed
|
[
"# Dataset Card for \"20_newsgroups\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"20_newsgroups\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"20_newsgroups\"\n\nMore Information needed"
] |
bb01474726bc79d3f74048b8b557a05abc517b2b
|
# Dataset Card for "LoRAData"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
rukkuhru/LoRAData
|
[
"region:us"
] |
2023-09-13T06:28:32+00:00
|
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 11257.0, "num_examples": 5}], "download_size": 23185, "dataset_size": 11257.0}}
|
2023-09-26T05:08:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "LoRAData"
More Information needed
|
[
"# Dataset Card for \"LoRAData\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"LoRAData\"\n\nMore Information needed"
] |
[
6,
13
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"LoRAData\"\n\nMore Information needed"
] |
150c42d174bbdc9ca07ddf8e6ddb47fac52fd86e
|
# Dataset Card for "Evol-TheVault"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
NamCyan/Evol-TheVault
|
[
"region:us"
] |
2023-09-13T06:35:36+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "instruction", "dtype": "string"}, {"name": "code", "dtype": "string"}, {"name": "tokenized_instruction", "sequence": "string"}, {"name": "type", "dtype": "string"}, {"name": "language", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 175466743, "num_examples": 47797}], "download_size": 55571461, "dataset_size": 175466743}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-15T17:04:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "Evol-TheVault"
More Information needed
|
[
"# Dataset Card for \"Evol-TheVault\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"Evol-TheVault\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"Evol-TheVault\"\n\nMore Information needed"
] |
ba2ffe883f41d7137789d63e3e3c92e4fc30beab
|
# Dataset Card for "sci-llm-60k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
facat/sci-llm-60k
|
[
"region:us"
] |
2023-09-13T07:04:48+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 330020705, "num_examples": 60347}, {"name": "test", "num_bytes": 1111116, "num_examples": 200}], "download_size": 183205878, "dataset_size": 331131821}}
|
2023-09-14T01:18:33+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "sci-llm-60k"
More Information needed
|
[
"# Dataset Card for \"sci-llm-60k\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"sci-llm-60k\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"sci-llm-60k\"\n\nMore Information needed"
] |
e0cdfa4e560f517efff61abdfeff9aa0f06a37b7
|
# merge of some datasets from Alpaca Cot
|
NewstaR/Camildae
|
[
"task_categories:question-answering",
"task_categories:conversational",
"license:apache-2.0",
"COT",
"AlpacaCOT",
"Merge",
"Format",
"region:us"
] |
2023-09-13T07:08:05+00:00
|
{"license": "apache-2.0", "task_categories": ["question-answering", "conversational"], "tags": ["COT", "AlpacaCOT", "Merge", "Format"]}
|
2023-10-14T09:09:07+00:00
|
[] |
[] |
TAGS
#task_categories-question-answering #task_categories-conversational #license-apache-2.0 #COT #AlpacaCOT #Merge #Format #region-us
|
# merge of some datasets from Alpaca Cot
|
[
"# merge of some datasets from Alpaca Cot"
] |
[
"TAGS\n#task_categories-question-answering #task_categories-conversational #license-apache-2.0 #COT #AlpacaCOT #Merge #Format #region-us \n",
"# merge of some datasets from Alpaca Cot"
] |
[
51,
12
] |
[
"passage: TAGS\n#task_categories-question-answering #task_categories-conversational #license-apache-2.0 #COT #AlpacaCOT #Merge #Format #region-us \n# merge of some datasets from Alpaca Cot"
] |
de64a7e1470bb9d04c85206152bc6b00cb28a411
|
# Big Patent Clustering Dataset
This dataset was created for patent classification. It is derived from the [big patent dataset](https://huggingface.co/datasets/big_patent) but contains only a subset of the test set of the original dataset.
The subsets contain only patents that are assigned to a single category in the original dataset.
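A minimal loading sketch (the split layout is not documented on this card, so `load_dataset` is called without a split in order to inspect it):

```python
from datasets import load_dataset

# Returns a DatasetDict; print it to see which splits/fields the repo exposes.
ds = load_dataset("jinaai/big-patent-clustering")
print(ds)
```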
|
jinaai/big-patent-clustering
|
[
"language:en",
"license:cc-by-4.0",
"region:us"
] |
2023-09-13T07:09:11+00:00
|
{"language": ["en"], "license": "cc-by-4.0"}
|
2023-09-26T09:53:01+00:00
|
[] |
[
"en"
] |
TAGS
#language-English #license-cc-by-4.0 #region-us
|
# Big Patent Clustering Dataset
This dataset was created for patent classification. It is derived from the big patent dataset but contains only a subset of the test set of the original dataset.
The subsets contain only patents that are assigned to a single category in the original dataset.
|
[
"# Big Patent Clustering Dataset\n\nThis dataset is created for patent classification. It is derived from the big patent dataset but only contains a subset of the test set of the original dataset.\nThe subsets contain only patents which are assigned to one single category in the original dataset."
] |
[
"TAGS\n#language-English #license-cc-by-4.0 #region-us \n",
"# Big Patent Clustering Dataset\n\nThis dataset is created for patent classification. It is derived from the big patent dataset but only contains a subset of the test set of the original dataset.\nThe subsets contain only patents which are assigned to one single category in the original dataset."
] |
[
19,
67
] |
[
"passage: TAGS\n#language-English #license-cc-by-4.0 #region-us \n# Big Patent Clustering Dataset\n\nThis dataset is created for patent classification. It is derived from the big patent dataset but only contains a subset of the test set of the original dataset.\nThe subsets contain only patents which are assigned to one single category in the original dataset."
] |
b1f837266cf954209278c33a20dc574cbcc8e8a7
|
# Dataset Card for "prompt-recommendation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Andyrasika/prompt-recommendation
|
[
"region:us"
] |
2023-09-13T07:11:19+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "source", "dtype": "string"}, {"name": "target", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 64111, "num_examples": 100}, {"name": "eval", "num_bytes": 13427, "num_examples": 21}], "download_size": 18652, "dataset_size": 77538}}
|
2023-09-13T07:11:27+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "prompt-recommendation"
More Information needed
|
[
"# Dataset Card for \"prompt-recommendation\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"prompt-recommendation\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"prompt-recommendation\"\n\nMore Information needed"
] |
fb68b9fcbdedbbf023f0c2a4d27879cccb61a8d9
|
# Dataset Card for "squad_v2_1000_0.90_id"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
tyzhu/squad_v2_1000_0.90_id
|
[
"region:us"
] |
2023-09-13T07:14:13+00:00
|
{"dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int64"}, {"name": "text", "sequence": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 122370541.34995362, "num_examples": 70448}, {"name": "validation", "num_bytes": 1920159, "num_examples": 1000}], "download_size": 5249130, "dataset_size": 124290700.34995362}}
|
2023-09-13T07:14:26+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "squad_v2_1000_0.90_id"
More Information needed
|
[
"# Dataset Card for \"squad_v2_1000_0.90_id\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"squad_v2_1000_0.90_id\"\n\nMore Information needed"
] |
[
6,
22
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"squad_v2_1000_0.90_id\"\n\nMore Information needed"
] |
59b9bc3030c14e798d08ecdecf1c5c3f1e0edbd6
|
# Dataset Card for "KoAlpaca-v1.1a"
## Project Repo
- Github Repo: [Beomi/KoAlpaca](https://github.com/Beomi/KoAlpaca)
## How to use
```python
>>> from datasets import load_dataset
>>> ds = load_dataset("beomi/KoAlpaca-v1.1a", split="train")
>>> ds
Dataset({
features: ['instruction', 'input', 'output'],
num_rows: 21272
})
```
```python
>>> ds[0]
{'instruction': '양파는 어떤 식물 부위인가요? 그리고 고구마는 뿌리인가요?',
'output': '양파는 잎이 아닌 식물의 줄기 부분입니다. 고구마는 식물의 뿌리 부분입니다. \n\n식물의 부위의 구분에 대해 궁금해하는 분이라면 분명 이 질문에 대한 답을 찾고 있을 것입니다. 양파는 잎이 아닌 줄기 부분입니다. 고구마는 다른 질문과 답변에서 언급된 것과 같이 뿌리 부분입니다. 따라서, 양파는 식물의 줄기 부분이 되고, 고구마는 식물의 뿌리 부분입니다.\n\n 덧붙이는 답변: 고구마 줄기도 볶아먹을 수 있나요? \n\n고구마 줄기도 식용으로 볶아먹을 수 있습니다. 하지만 줄기 뿐만 아니라, 잎, 씨, 뿌리까지 모든 부위가 식용으로 활용되기도 합니다. 다만, 한국에서는 일반적으로 뿌리 부분인 고구마를 주로 먹습니다.',
 'url': 'https://kin.naver.com/qna/detail.naver?d1id=11&dirId=1116&docId=55320268'}
```
|
Taegyuu/KoAlpaca_hira_v1.1a
|
[
"task_categories:text-generation",
"language:ko",
"KoAlpaca",
"region:us"
] |
2023-09-13T07:22:21+00:00
|
{"language": ["ko"], "task_categories": ["text-generation"], "pretty_name": "KoAlpaca_hira_v1.1c", "dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "url", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24149775, "num_examples": 21267}], "download_size": 24149775, "dataset_size": 24149775}, "tags": ["KoAlpaca"]}
|
2023-09-13T10:11:19+00:00
|
[] |
[
"ko"
] |
TAGS
#task_categories-text-generation #language-Korean #KoAlpaca #region-us
|
# Dataset Card for "KoAlpaca-v1.1a"
## Project Repo
- Github Repo: Beomi/KoAlpaca
## How to use
```python
>>> ds[0]
{'instruction': '양파는 어떤 식물 부위인가요? 그리고 고구마는 뿌리인가요?',
 'output': '양파는 잎이 아닌 식물의 줄기 부분입니다. 고구마는 식물의 뿌리 부분입니다. \n\n식물의 부위의 구분에 대해 궁금해하는 분이라면 분명 이 질문에 대한 답을 찾고 있을 것입니다. 양파는 잎이 아닌 줄기 부분입니다. 고구마는 다른 질문과 답변에서 언급된 것과 같이 뿌리 부분입니다. 따라서, 양파는 식물의 줄기 부분이 되고, 고구마는 식물의 뿌리 부분입니다.\n\n 덧붙이는 답변: 고구마 줄기도 볶아먹을 수 있나요? \n\n고구마 줄기도 식용으로 볶아먹을 수 있습니다. 하지만 줄기 뿐만 아니라, 잎, 씨, 뿌리까지 모든 부위가 식용으로 활용되기도 합니다. 다만, 한국에서는 일반적으로 뿌리 부분인 고구마를 주로 먹습니다.',
 'url': 'URL
```
|
[
"# Dataset Card for \"KoAlpaca-v1.1a\"",
"## Project Repo\n\n- Github Repo: Beomi/KoAlpaca",
"## How to use\n\n\n\n\n'''python\n>>> ds[0]\n{'instruction': '양파는 어떤 식물 부위인가요? 그리고 고구마는 뿌리인가요?',\n 'output': '양파는 잎이 아닌 식물의 줄기 부분입니다. 고구마는 식물의 뿌리 부분입니다. \\n\\n식물의 부위의 구분에 대해 궁금해하는 분이라면 분명 이 질문에 대한 답을 찾고 있을 것입니다. 양파는 잎이 아닌 줄기 부분입니다. 고구마는 다른 질문과 답변에서 언급된 것과 같이 뿌리 부분입니다. 따라서, 양파는 식물의 줄기 부분이 되고, 고구마는 식물의 뿌리 부분입니다.\\n\\n 덧붙이는 답변: 고구마 줄기도 볶아먹을 수 있나요? \\n\\n고구마 줄기도 식용으로 볶아먹을 수 있습니다. 하지만 줄기 뿐만 아니라, 잎, 씨, 뿌리까지 모든 부위가 식용으로 활용되기도 합니다. 다만, 한국에서는 일반적으로 뿌리 부분인 고구마를 주로 먹습니다.',\n 'url': 'URL"
] |
[
"TAGS\n#task_categories-text-generation #language-Korean #KoAlpaca #region-us \n",
"# Dataset Card for \"KoAlpaca-v1.1a\"",
"## Project Repo\n\n- Github Repo: Beomi/KoAlpaca",
"## How to use\n\n\n\n\n'''python\n>>> ds[0]\n{'instruction': '양파는 어떤 식물 부위인가요? 그리고 고구마는 뿌리인가요?',\n 'output': '양파는 잎이 아닌 식물의 줄기 부분입니다. 고구마는 식물의 뿌리 부분입니다. \\n\\n식물의 부위의 구분에 대해 궁금해하는 분이라면 분명 이 질문에 대한 답을 찾고 있을 것입니다. 양파는 잎이 아닌 줄기 부분입니다. 고구마는 다른 질문과 답변에서 언급된 것과 같이 뿌리 부분입니다. 따라서, 양파는 식물의 줄기 부분이 되고, 고구마는 식물의 뿌리 부분입니다.\\n\\n 덧붙이는 답변: 고구마 줄기도 볶아먹을 수 있나요? \\n\\n고구마 줄기도 식용으로 볶아먹을 수 있습니다. 하지만 줄기 뿐만 아니라, 잎, 씨, 뿌리까지 모든 부위가 식용으로 활용되기도 합니다. 다만, 한국에서는 일반적으로 뿌리 부분인 고구마를 주로 먹습니다.',\n 'url': 'URL"
] |
[
27,
15,
18,
251
] |
[
"passage: TAGS\n#task_categories-text-generation #language-Korean #KoAlpaca #region-us \n# Dataset Card for \"KoAlpaca-v1.1a\"## Project Repo\n\n- Github Repo: Beomi/KoAlpaca## How to use\n\n\n\n\n'''python\n>>> ds[0]\n{'instruction': '양파는 어떤 식물 부위인가요? 그리고 고구마는 뿌리인가요?',\n 'output': '양파는 잎이 아닌 식물의 줄기 부분입니다. 고구마는 식물의 뿌리 부분입니다. \\n\\n식물의 부위의 구분에 대해 궁금해하는 분이라면 분명 이 질문에 대한 답을 찾고 있을 것입니다. 양파는 잎이 아닌 줄기 부분입니다. 고구마는 다른 질문과 답변에서 언급된 것과 같이 뿌리 부분입니다. 따라서, 양파는 식물의 줄기 부분이 되고, 고구마는 식물의 뿌리 부분입니다.\\n\\n 덧붙이는 답변: 고구마 줄기도 볶아먹을 수 있나요? \\n\\n고구마 줄기도 식용으로 볶아먹을 수 있습니다. 하지만 줄기 뿐만 아니라, 잎, 씨, 뿌리까지 모든 부위가 식용으로 활용되기도 합니다. 다만, 한국에서는 일반적으로 뿌리 부분인 고구마를 주로 먹습니다.',\n 'url': 'URL"
] |
f79f092c13085394ba23b147a32febc0c63ad3f3
|
# Dataset Card for "fabric_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
pwc-india/fabric_dataset
|
[
"region:us"
] |
2023-09-13T07:22:30+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 41259319.0, "num_examples": 20}], "download_size": 41261924, "dataset_size": 41259319.0}}
|
2023-09-13T07:39:01+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "fabric_dataset"
More Information needed
|
[
"# Dataset Card for \"fabric_dataset\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"fabric_dataset\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"fabric_dataset\"\n\nMore Information needed"
] |
6ef094de5d4cb58e877be71bc7f38a862e486415
|
# Dataset Card for Evaluation run of DevaMalla/llama_7b_qlora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DevaMalla/llama_7b_qlora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [DevaMalla/llama_7b_qlora](https://huggingface.co/DevaMalla/llama_7b_qlora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DevaMalla__llama_7b_qlora",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-27T12:48:55.700412](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_qlora/blob/main/results_2023-10-27T12-48-55.700412.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857114,
"f1": 0.05457109899328867,
"f1_stderr": 0.0012808137665620593,
"acc": 0.38304442448507725,
"acc_stderr": 0.009175242098063443
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857114,
"f1": 0.05457109899328867,
"f1_stderr": 0.0012808137665620593
},
"harness|gsm8k|5": {
"acc": 0.045489006823351025,
"acc_stderr": 0.0057396576567222
},
"harness|winogrande|5": {
"acc": 0.7205998421468035,
"acc_stderr": 0.012610826539404686
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_DevaMalla__llama_7b_qlora
|
[
"region:us"
] |
2023-09-13T07:44:16+00:00
|
{"pretty_name": "Evaluation run of DevaMalla/llama_7b_qlora", "dataset_summary": "Dataset automatically created during the evaluation run of model [DevaMalla/llama_7b_qlora](https://huggingface.co/DevaMalla/llama_7b_qlora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DevaMalla__llama_7b_qlora\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-27T12:48:55.700412](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_qlora/blob/main/results_2023-10-27T12-48-55.700412.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857114,\n \"f1\": 0.05457109899328867,\n \"f1_stderr\": 0.0012808137665620593,\n \"acc\": 0.38304442448507725,\n \"acc_stderr\": 0.009175242098063443\n },\n \"harness|drop|3\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857114,\n \"f1\": 0.05457109899328867,\n \"f1_stderr\": 0.0012808137665620593\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.045489006823351025,\n \"acc_stderr\": 0.0057396576567222\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7205998421468035,\n \"acc_stderr\": 0.012610826539404686\n }\n}\n```", "repo_url": "https://huggingface.co/DevaMalla/llama_7b_qlora", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|arc:challenge|25_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_27T12_48_55.700412", "path": ["**/details_harness|drop|3_2023-10-27T12-48-55.700412.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-27T12-48-55.700412.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_27T12_48_55.700412", "path": ["**/details_harness|gsm8k|5_2023-10-27T12-48-55.700412.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-27T12-48-55.700412.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hellaswag|10_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-44-02.793862.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-44-02.793862.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T08-44-02.793862.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T08-44-02.793862.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T08-44-02.793862.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_27T12_48_55.700412", "path": ["**/details_harness|winogrande|5_2023-10-27T12-48-55.700412.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-27T12-48-55.700412.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_13T08_44_02.793862", "path": ["results_2023-09-13T08-44-02.793862.parquet"]}, {"split": "2023_10_27T12_48_55.700412", "path": ["results_2023-10-27T12-48-55.700412.parquet"]}, {"split": "latest", "path": ["results_2023-10-27T12-48-55.700412.parquet"]}]}]}
|
2023-10-27T11:49:08+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of DevaMalla/llama_7b_qlora
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model DevaMalla/llama_7b_qlora on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
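A minimal example, using the `harness_winogrande_5` config (the same snippet given in this card's metadata):
```python
from datasets import load_dataset

# the "train" split always points to the latest results for this config
data = load_dataset("open-llm-leaderboard/details_DevaMalla__llama_7b_qlora",
	"harness_winogrande_5",
	split="train")
```
Any other config name listed in the metadata (for instance `harness_drop_3` or `harness_gsm8k_5`) can be passed in place of `harness_winogrande_5`.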
## Latest results
These are the latest results from run 2023-10-27T12:48:55.700412 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
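```python
{
    "all": {
        "em": 0.001153523489932886,
        "em_stderr": 0.0003476179896857114,
        "f1": 0.05457109899328867,
        "f1_stderr": 0.0012808137665620593,
        "acc": 0.38304442448507725,
        "acc_stderr": 0.009175242098063443
    },
    "harness|drop|3": {
        "em": 0.001153523489932886,
        "em_stderr": 0.0003476179896857114,
        "f1": 0.05457109899328867,
        "f1_stderr": 0.0012808137665620593
    },
    "harness|gsm8k|5": {
        "acc": 0.045489006823351025,
        "acc_stderr": 0.0057396576567222
    },
    "harness|winogrande|5": {
        "acc": 0.7205998421468035,
        "acc_stderr": 0.012610826539404686
    }
}
```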
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of DevaMalla/llama_7b_qlora",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model DevaMalla/llama_7b_qlora on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-27T12:48:55.700412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of DevaMalla/llama_7b_qlora",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model DevaMalla/llama_7b_qlora on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-27T12:48:55.700412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of DevaMalla/llama_7b_qlora## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model DevaMalla/llama_7b_qlora on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-27T12:48:55.700412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
5effbcb1a35a2b0b830776a3f24eed3b74c5891e
|
# Dataset of mimura_kanako/三村かな子/미무라카나코 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of mimura_kanako/三村かな子/미무라카나코 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `brown_hair, short_hair, brown_eyes, breasts, hair_ornament, hair_flower, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 443.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mimura_kanako_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 307.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mimura_kanako_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1058 | 590.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mimura_kanako_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 411.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mimura_kanako_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1058 | 753.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mimura_kanako_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mimura_kanako_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, navel, plump, smile, solo, belly, flower, open_mouth, simple_background, white_background, huge_breasts, nipples, bra, cleavage, heart, open_clothes, sweat, wariza, white_panties |
| 1 | 10 |  |  |  |  |  | 1girl, navel, nipples, solo, blush, pussy, flower, smile, looking_at_viewer, completely_nude |
| 2 | 31 |  |  |  |  |  | 1girl, solo, flower, school_uniform, cardigan, plaid_skirt, smile, blush, open_mouth, food |
| 3 | 6 |  |  |  |  |  | :d, blue_bowtie, blush, cardigan, flower, looking_at_viewer, open_mouth, school_uniform, 1girl, blue_skirt, long_sleeves, pleated_skirt, solo, plaid_skirt, simple_background, holding, socks, white_background, white_shirt |
| 4 | 7 |  |  |  |  |  | 1girl, cleavage, dress, open_mouth, solo, smile, blush, thighhighs, elbow_gloves, hairband, medium_breasts |
| 5 | 8 |  |  |  |  |  | 1boy, 1girl, blush, flower, hetero, penis, solo_focus, nipples, paizuri, cum_on_breasts, facial, huge_breasts, open_mouth, looking_at_viewer, mosaic_censoring, breast_grab, grabbing, shirt_lift, smile |
| 6 | 14 |  |  |  |  |  | 1boy, 1girl, flower, hetero, solo_focus, blush, navel, nipples, sex, vaginal, censored, nude, pussy, open_mouth, penis, plump, sweat, female_pubic_hair, lying, spread_legs |
| 7 | 6 |  |  |  |  |  | 1girl, long_sleeves, maid_headdress, pink_dress, solo, blush, frilled_sleeves, looking_at_viewer, wide_sleeves, bangs, bow, cleavage, frilled_choker, holding, ribbon, smile, waist_apron, white_thighhighs, flower, frilled_apron, puffy_sleeves, sash |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | navel | plump | smile | solo | belly | flower | open_mouth | simple_background | white_background | huge_breasts | nipples | bra | cleavage | heart | open_clothes | sweat | wariza | white_panties | pussy | completely_nude | school_uniform | cardigan | plaid_skirt | food | :d | blue_bowtie | blue_skirt | long_sleeves | pleated_skirt | holding | socks | white_shirt | dress | thighhighs | elbow_gloves | hairband | medium_breasts | 1boy | hetero | penis | solo_focus | paizuri | cum_on_breasts | facial | mosaic_censoring | breast_grab | grabbing | shirt_lift | sex | vaginal | censored | nude | female_pubic_hair | lying | spread_legs | maid_headdress | pink_dress | frilled_sleeves | wide_sleeves | bangs | bow | frilled_choker | ribbon | waist_apron | white_thighhighs | frilled_apron | puffy_sleeves | sash |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:--------|:--------|:--------|:-------|:--------|:---------|:-------------|:--------------------|:-------------------|:---------------|:----------|:------|:-----------|:--------|:---------------|:--------|:---------|:----------------|:--------|:------------------|:-----------------|:-----------|:--------------|:-------|:-----|:--------------|:-------------|:---------------|:----------------|:----------|:--------|:--------------|:--------|:-------------|:---------------|:-----------|:-----------------|:-------|:---------|:--------|:-------------|:----------|:-----------------|:---------|:-------------------|:--------------|:-----------|:-------------|:------|:----------|:-----------|:-------|:--------------------|:--------|:--------------|:-----------------|:-------------|:------------------|:---------------|:--------|:------|:-----------------|:---------|:--------------|:-------------------|:----------------|:----------------|:-------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | | X | X | | X | | | | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 31 |  |  |  |  |  | X | X | | | | X | X | | X | X | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | | | | X | | X | X | X | X | | | | | | | | | | | | X | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | | | | X | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | X | | | X | | | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 6 | 14 |  |  |  |  |  | X | X | | X | X | | | | X | X | | | | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | X | | | X | X | | X | | | | | | | X | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/mimura_kanako_idolmastercinderellagirls
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-13T07:50:08+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-16T13:01:30+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of mimura\_kanako/三村かな子/미무라카나코 (THE iDOLM@STER: Cinderella Girls)
=========================================================================
This is the dataset of mimura\_kanako/三村かな子/미무라카나코 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are 'brown\_hair, short\_hair, brown\_eyes, breasts, hair\_ornament, hair\_flower, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code:
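```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file from the dataset repository
zip_file = hf_hub_download(
    repo_id='CyberHarem/mimura_kanako_idolmastercinderellagirls',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc and iterate over tagged images
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```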
List of Clusters
----------------
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
61065bfa8c3dd4997f5f455d98196cc9951d1688
|
# Dataset Card for Evaluation run of dfurman/falcon-40b-openassistant-peft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dfurman/falcon-40b-openassistant-peft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [dfurman/falcon-40b-openassistant-peft](https://huggingface.co/dfurman/falcon-40b-openassistant-peft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dfurman__falcon-40b-openassistant-peft",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T22:59:49.986457](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__falcon-40b-openassistant-peft/blob/main/results_2023-10-28T22-59-49.986457.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004299496644295302,
"em_stderr": 0.0006700586558630089,
"f1": 0.06359060402684574,
"f1_stderr": 0.0014332954865830501,
"acc": 0.4739784570478341,
"acc_stderr": 0.010145228456462492
},
"harness|drop|3": {
"em": 0.004299496644295302,
"em_stderr": 0.0006700586558630089,
"f1": 0.06359060402684574,
"f1_stderr": 0.0014332954865830501
},
"harness|gsm8k|5": {
"acc": 0.133434420015163,
"acc_stderr": 0.00936649160978448
},
"harness|winogrande|5": {
"acc": 0.8145224940805051,
"acc_stderr": 0.010923965303140505
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_dfurman__falcon-40b-openassistant-peft
|
[
"region:us"
] |
2023-09-13T07:57:43+00:00
|
{"pretty_name": "Evaluation run of dfurman/falcon-40b-openassistant-peft", "dataset_summary": "Dataset automatically created during the evaluation run of model [dfurman/falcon-40b-openassistant-peft](https://huggingface.co/dfurman/falcon-40b-openassistant-peft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dfurman__falcon-40b-openassistant-peft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T22:59:49.986457](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__falcon-40b-openassistant-peft/blob/main/results_2023-10-28T22-59-49.986457.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004299496644295302,\n \"em_stderr\": 0.0006700586558630089,\n \"f1\": 0.06359060402684574,\n \"f1_stderr\": 0.0014332954865830501,\n \"acc\": 0.4739784570478341,\n \"acc_stderr\": 0.010145228456462492\n },\n \"harness|drop|3\": {\n \"em\": 0.004299496644295302,\n \"em_stderr\": 0.0006700586558630089,\n \"f1\": 0.06359060402684574,\n \"f1_stderr\": 0.0014332954865830501\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.133434420015163,\n \"acc_stderr\": 0.00936649160978448\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140505\n }\n}\n```", "repo_url": "https://huggingface.co/dfurman/falcon-40b-openassistant-peft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|arc:challenge|25_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T22_59_49.986457", "path": ["**/details_harness|drop|3_2023-10-28T22-59-49.986457.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T22-59-49.986457.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T22_59_49.986457", "path": ["**/details_harness|gsm8k|5_2023-10-28T22-59-49.986457.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T22-59-49.986457.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hellaswag|10_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-57-30.972897.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-57-30.972897.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-57-30.972897.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T08-57-30.972897.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T08-57-30.972897.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T08-57-30.972897.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T22_59_49.986457", "path": ["**/details_harness|winogrande|5_2023-10-28T22-59-49.986457.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T22-59-49.986457.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_13T08_57_30.972897", "path": ["results_2023-09-13T08-57-30.972897.parquet"]}, {"split": "2023_10_28T22_59_49.986457", "path": ["results_2023-10-28T22-59-49.986457.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T22-59-49.986457.parquet"]}]}]}
|
2023-10-28T22:00:03+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of dfurman/falcon-40b-openassistant-peft
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model dfurman/falcon-40b-openassistant-peft on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
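The snippet below reproduces the loading example from this card's metadata; the second call is an assumption on my part, based on the config list in that metadata, which exposes a "results" configuration with a "latest" split.

```python
from datasets import load_dataset

# Per-sample details for one evaluated task (example from the card metadata).
data = load_dataset("open-llm-leaderboard/details_dfurman__falcon-40b-openassistant-peft",
                    "harness_winogrande_5",
                    split="train")

# Aggregated metrics; per the config list, the "latest" split points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_dfurman__falcon-40b-openassistant-peft",
                       "results",
                       split="latest")
```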
## Latest results
These are the latest results from run 2023-10-28T22:59:49.986457 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
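The aggregated numbers from that run, as recorded in the card metadata:

```python
{
    "all": {
        "em": 0.004299496644295302,
        "em_stderr": 0.0006700586558630089,
        "f1": 0.06359060402684574,
        "f1_stderr": 0.0014332954865830501,
        "acc": 0.4739784570478341,
        "acc_stderr": 0.010145228456462492
    },
    "harness|drop|3": {
        "em": 0.004299496644295302,
        "em_stderr": 0.0006700586558630089,
        "f1": 0.06359060402684574,
        "f1_stderr": 0.0014332954865830501
    },
    "harness|gsm8k|5": {
        "acc": 0.133434420015163,
        "acc_stderr": 0.00936649160978448
    },
    "harness|winogrande|5": {
        "acc": 0.8145224940805051,
        "acc_stderr": 0.010923965303140505
    }
}
```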
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of dfurman/falcon-40b-openassistant-peft",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model dfurman/falcon-40b-openassistant-peft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T22:59:49.986457(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of dfurman/falcon-40b-openassistant-peft",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model dfurman/falcon-40b-openassistant-peft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T22:59:49.986457(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of dfurman/falcon-40b-openassistant-peft## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model dfurman/falcon-40b-openassistant-peft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T22:59:49.986457(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
dd36394bb5c6cd27ddf9a7e2f5781e2896493884
|
<h1 align="center">TAL-SCQ5K</h1>
## Dataset Description
### Dataset Summary
TAL-SCQ5K-EN/TAL-SCQ5K-CN are high-quality mathematical competition datasets in English and Chinese created by TAL Education Group, each consisting of 5K questions (3K training and 2K testing). The questions are multiple-choice and cover mathematical topics at the primary, junior high and high school levels. In addition, detailed solution steps are provided to facilitate CoT training, and all the mathematical expressions in the questions are presented as standard text-mode LaTeX.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The text in TAL-SCQ5K-EN is in English and TAL-SCQ5K-CN is in Chinese.
## Dataset Structure
### Data Instances
```
{
"dataset_name": "prime_math_competition_en_single_choice_8K_dev",
"dataset_version": "2023-07-07",
"qid": "244",
"queId": "8afc802a8c304199b1040f11ffa2e92a",
"competition_source_list": [],
"difficulty": "2",
"qtype": "single_choice",
"problem": "A $14$-digit. number $666666 XY 444444$ is a multiple of $26$. If $X$ and $Y$ are both positive, what is the smallest vaue of $X+ Y$? ",
"answer_option_list": [
[{
"aoVal": "A",
"content": "$$3$$ "
}],
[{
"aoVal": "B",
"content": "$$4$$ "
}],
[{
"aoVal": "C",
"content": "$$9$$ "
}],
[{
"aoVal": "D",
"content": "$$14$$ "
}],
[{
"aoVal": "E",
"content": "None of the above "
}]
],
"knowledge_point_routes": ["Overseas Competition->Knowledge Point->Number Theory Modules->Division without Remainders->Divisibility Rules"],
"answer_analysis": ["Since $1001$ is a multiple of $13$, $111111 = 111 \\times 1001$ is also a multiple of $13$. It follows that both $666666$ and $444444$ are both multiples of $26$. $666666XY 444444 = 66666600000000 + XY 000000 + 444444$ $\\Rightarrow XY$ must be divisible by $13$. Smallest $X+Y=1+3=4$. "],
"answer_value": "B"
}
```
### Data Fields
* "dataset_name": identification of the source dataset name from which TAL-SCQ5K-EN/TAL-SCQ5K-CN has been created, use only for inner of TAL education group, please ignore.
* "dataset_version": identification of the source dataset version from which TAL-SCQ5K-EN/TAL-SCQ5K-CN has been created, use only for inner of TAL education group, please ignore.
* "qid": identification of local id of the question in the source dataset from which TAL-SCQ5K-EN/TAL-SCQ5K-CN has been created, use only for inner of TAL education group, please ignore.
* "queId": identification of global id of the question, use only for inner of TAL education group, please ignore.
* "competition_source_list": identification of math competitions in which the questions appeared, if have been logged.
* "difficulty": difficulty level of the questions, value ranged from 0 to 4
* "qtype": question type, valued as "single_choice" for all the questions in this dataset indicates that all the questions are multiple-choice questions with unique ground-truth answer.
* "problem": the question string to a math competition question.
* "answer_option_list": answer choices to be selected
* "knowledge_point_routes": knowledge point route from coarse-grained to fine-grained.
* "answer_analysis": step-by-step answer analysis of the questions, which helps CoT training
* "answer_value": value of the ground-truth answer choice
### Data Splits
<style>
table th:first-of-type {
width: 40%;
}
table th:nth-of-type(2) {
width: 30%;
}
table th:nth-of-type(3) {
width: 30%;
}
</style>
| name | train | test |
|:---:|:----:|:----:|
| TAL-SCQ5K-EN | 3K | 2K |
| TAL-SCQ5K-CN | 3K | 2K |
## Usage
Each of the above datasets is located in a separate sub-directory. To load an individual subset, use the data_dir argument of the load_dataset() function as follows:
```python
from datasets import load_dataset
# Load all subsets (share the same schema)
dataset = load_dataset("math-eval/TAL-SCQ5K")
# Load TAL-SCQ5K-EN
dataset = load_dataset("math-eval/TAL-SCQ5K", data_dir="TAL-SCQ5K-EN")
# Load TAL-SCQ5K-CN
dataset = load_dataset("math-eval/TAL-SCQ5K", data_dir="TAL-SCQ5K-CN")
```
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The TAL-SCQ5K dataset is licensed under the [MIT License](https://opensource.org/license/mit/)
### Citation Information
[More Information Needed]
### Contact
The original authors host this dataset on GitHub: https://github.com/math-eval/TAL-SCQ5K. You can submit inquiries to [email protected].
|
math-eval/TAL-SCQ5K
|
[
"license:mit",
"region:us"
] |
2023-09-13T07:58:01+00:00
|
{"license": "mit"}
|
2023-09-15T05:37:10+00:00
|
[] |
[] |
TAGS
#license-mit #region-us
|
TAL-SCQ5K
=========
Dataset Description
-------------------
### Dataset Summary
TAL-SCQ5K-EN/TAL-SCQ5K-CN are high-quality mathematical competition datasets in English and Chinese created by TAL Education Group, each consisting of 5K questions (3K training and 2K testing). The questions are multiple-choice and cover mathematical topics at the primary, junior high and high school levels. In addition, detailed solution steps are provided to facilitate CoT training, and all the mathematical expressions in the questions are presented as standard text-mode LaTeX.
### Supported Tasks and Leaderboards
### Languages
The text in TAL-SCQ5K-EN is in English and TAL-SCQ5K-CN is in Chinese.
Dataset Structure
-----------------
### Data Instances
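An abridged example, reproduced from the raw card above ("..." marks elisions):

```
{
  "qid": "244",
  "queId": "8afc802a8c304199b1040f11ffa2e92a",
  "difficulty": "2",
  "qtype": "single_choice",
  "problem": "A $14$-digit. number $666666 XY 444444$ is a multiple of $26$. ...",
  "answer_option_list": [[{"aoVal": "A", "content": "$$3$$ "}], ..., [{"aoVal": "E", "content": "None of the above "}]],
  "knowledge_point_routes": ["Overseas Competition->Knowledge Point->Number Theory Modules->Division without Remainders->Divisibility Rules"],
  "answer_analysis": ["Since $1001$ is a multiple of $13$ ..."],
  "answer_value": "B"
}
```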
### Data Fields
* "dataset\_name": identification of the source dataset name from which TAL-SCQ5K-EN/TAL-SCQ5K-CN has been created, use only for inner of TAL education group, please ignore.
* "dataset\_version": identification of the source dataset version from which TAL-SCQ5K-EN/TAL-SCQ5K-CN has been created, use only for inner of TAL education group, please ignore.
* "qid": identification of local id of the question in the source dataset from which TAL-SCQ5K-EN/TAL-SCQ5K-CN has been created, use only for inner of TAL education group, please ignore.
* "queId": identification of global id of the question, use only for inner of TAL education group, please ignore.
* "competition\_source\_list": identification of math competitions in which the questions appeared, if have been logged.
* "difficulty": difficulty level of the questions, value ranged from 0 to 4
* "qtype": question type, valued as "single\_choice" for all the questions in this dataset indicates that all the questions are multiple-choice questions with unique ground-truth answer.
* "problem": the question string to a math competition question.
* "answer\_option\_list": answer choices to be selected
* "knowledge\_point\_routes": knowledge point route from coarse-grained to fine-grained.
* "answer\_analysis": step-by-step answer analysis of the questions, which helps CoT training
* "answer\_value": value of the ground-truth answer choice
### Data Splits
| name | train | test |
|:---:|:----:|:----:|
| TAL-SCQ5K-EN | 3K | 2K |
| TAL-SCQ5K-CN | 3K | 2K |
Usage
-----
Each of the above datasets is located in a separate sub-directory. To load an individual subset, use the data\_dir argument of the load\_dataset() function as follows:
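The loading snippet, reproduced from the raw card above:

```python
from datasets import load_dataset

# Load all subsets (they share the same schema)
dataset = load_dataset("math-eval/TAL-SCQ5K")

# Load TAL-SCQ5K-EN
dataset = load_dataset("math-eval/TAL-SCQ5K", data_dir="TAL-SCQ5K-EN")

# Load TAL-SCQ5K-CN
dataset = load_dataset("math-eval/TAL-SCQ5K", data_dir="TAL-SCQ5K-CN")
```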
Additional Information
----------------------
### Dataset Curators
### Licensing Information
The TAL-SCQ5K dataset is licensed under the MIT License
### Contact
The original authors host this dataset on GitHub: URL. You can submit inquiries to: URL@URL.
|
[
"### Dataset Summary\n\n\nTAL-SCQ5K-EN/TAL-SCQ5K-CN are high quality mathematical competition datasets in English and Chinese language created by TAL Education Group, each consisting of 5K questions(3K training and 2K testing). The questions are in the form of multiple-choice and cover mathematical topics at the primary,junior high and high school levels. In addition, detailed solution steps are provided to facilitate CoT training and all the mathematical expressions in the questions have been presented as standard text-mode Latex.",
"### Supported Tasks and Leaderboards",
"### Languages\n\n\nThe text in TAL-SCQ5K-EN is in English and TAL-SCQ5K-CN is in Chinese.\n\n\nDataset Structure\n-----------------",
"### Data Instances",
"### Data Fields\n\n\n* \"dataset\\_name\": identification of the source dataset name from which TAL-SCQ5K-EN/TAL-SCQ5K-CN has been created, use only for inner of TAL education group, please ignore.\n* \"dataset\\_version\": identification of the source dataset version from which TAL-SCQ5K-EN/TAL-SCQ5K-CN has been created, use only for inner of TAL education group, please ignore.\n* \"qid\": identification of local id of the question in the source dataset from which TAL-SCQ5K-EN/TAL-SCQ5K-CN has been created, use only for inner of TAL education group, please ignore.\n* \"queId\": identification of global id of the question, use only for inner of TAL education group, please ignore.\n* \"competition\\_source\\_list\": identification of math competitions in which the questions appeared, if have been logged.\n* \"difficulty\": difficulty level of the questions, value ranged from 0 to 4\n* \"qtype\": question type, valued as \"single\\_choice\" for all the questions in this dataset indicates that all the questions are multiple-choice questions with unique ground-truth answer.\n* \"problem\": the question string to a math competition question.\n* \"answer\\_option\\_list\": answer choices to be selected\n* \"knowledge\\_point\\_routes\": knowledge point route from coarse-grained to fine-grained.\n* \"answer\\_analysis\": step-by-step answer analysis of the questions, which helps CoT training\n* \"answer\\_value\": value of the ground-truth answer choice",
"### Data Splits\n\n\n\ntable th:first-of-type {\n width: 40%;\n}\ntable th:nth-of-type(2) {\n width: 30%;\n}\ntable th:nth-of-type(3) {\n width: 30%;\n}\n\n\nUsage\n-----\n\n\nEach of the above datasets is located in a separate sub-directory. To load an individual subset, use the data\\_dir argument of the load\\_dataset() function as follows:\n\n\nAdditional Information\n----------------------",
"### Dataset Curators",
"### Licensing Information\n\n\nThe TAL-SCQ5K dataset is licensed under the MIT License",
"### Contact\n\n\nThe original authors host this dataset on GitHub here: URL You can submit inquiries to: URL@URL"
] |
[
"TAGS\n#license-mit #region-us \n",
"### Dataset Summary\n\n\nTAL-SCQ5K-EN/TAL-SCQ5K-CN are high quality mathematical competition datasets in English and Chinese language created by TAL Education Group, each consisting of 5K questions(3K training and 2K testing). The questions are in the form of multiple-choice and cover mathematical topics at the primary,junior high and high school levels. In addition, detailed solution steps are provided to facilitate CoT training and all the mathematical expressions in the questions have been presented as standard text-mode Latex.",
"### Supported Tasks and Leaderboards",
"### Languages\n\n\nThe text in TAL-SCQ5K-EN is in English and TAL-SCQ5K-CN is in Chinese.\n\n\nDataset Structure\n-----------------",
"### Data Instances",
"### Data Fields\n\n\n* \"dataset\\_name\": identification of the source dataset name from which TAL-SCQ5K-EN/TAL-SCQ5K-CN has been created, use only for inner of TAL education group, please ignore.\n* \"dataset\\_version\": identification of the source dataset version from which TAL-SCQ5K-EN/TAL-SCQ5K-CN has been created, use only for inner of TAL education group, please ignore.\n* \"qid\": identification of local id of the question in the source dataset from which TAL-SCQ5K-EN/TAL-SCQ5K-CN has been created, use only for inner of TAL education group, please ignore.\n* \"queId\": identification of global id of the question, use only for inner of TAL education group, please ignore.\n* \"competition\\_source\\_list\": identification of math competitions in which the questions appeared, if have been logged.\n* \"difficulty\": difficulty level of the questions, value ranged from 0 to 4\n* \"qtype\": question type, valued as \"single\\_choice\" for all the questions in this dataset indicates that all the questions are multiple-choice questions with unique ground-truth answer.\n* \"problem\": the question string to a math competition question.\n* \"answer\\_option\\_list\": answer choices to be selected\n* \"knowledge\\_point\\_routes\": knowledge point route from coarse-grained to fine-grained.\n* \"answer\\_analysis\": step-by-step answer analysis of the questions, which helps CoT training\n* \"answer\\_value\": value of the ground-truth answer choice",
"### Data Splits\n\n\n\ntable th:first-of-type {\n width: 40%;\n}\ntable th:nth-of-type(2) {\n width: 30%;\n}\ntable th:nth-of-type(3) {\n width: 30%;\n}\n\n\nUsage\n-----\n\n\nEach of the above datasets is located in a separate sub-directory. To load an individual subset, use the data\\_dir argument of the load\\_dataset() function as follows:\n\n\nAdditional Information\n----------------------",
"### Dataset Curators",
"### Licensing Information\n\n\nThe TAL-SCQ5K dataset is licensed under the MIT License",
"### Contact\n\n\nThe original authors host this dataset on GitHub here: URL You can submit inquiries to: URL@URL"
] |
[
11,
126,
10,
38,
6,
396,
111,
6,
22,
29
] |
[
"passage: TAGS\n#license-mit #region-us \n### Dataset Summary\n\n\nTAL-SCQ5K-EN/TAL-SCQ5K-CN are high quality mathematical competition datasets in English and Chinese language created by TAL Education Group, each consisting of 5K questions(3K training and 2K testing). The questions are in the form of multiple-choice and cover mathematical topics at the primary,junior high and high school levels. In addition, detailed solution steps are provided to facilitate CoT training and all the mathematical expressions in the questions have been presented as standard text-mode Latex.### Supported Tasks and Leaderboards### Languages\n\n\nThe text in TAL-SCQ5K-EN is in English and TAL-SCQ5K-CN is in Chinese.\n\n\nDataset Structure\n-----------------### Data Instances"
] |
90eb64e4587d9db0d2b085110d68e8753fb57b88
|
# Dataset Card for "address_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
konverner/fr-address
|
[
"region:us"
] |
2023-09-13T07:58:32+00:00
|
{"dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "labels", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 1399540, "num_examples": 5500}], "download_size": 208333, "dataset_size": 1399540}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-13T08:36:35+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "address_dataset"
More Information needed
|
[
"# Dataset Card for \"address_dataset\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"address_dataset\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"address_dataset\"\n\nMore Information needed"
] |
4f5432239cd9b401a65af54008752ad7fcad44a9
|
# Dataset Card for "sales1-formatted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
pssubitha/sales1-formatted
|
[
"region:us"
] |
2023-09-13T08:20:56+00:00
|
{"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 43483, "num_examples": 120}], "download_size": 25761, "dataset_size": 43483}}
|
2023-09-13T10:09:11+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "sales1-formatted"
More Information needed
|
[
"# Dataset Card for \"sales1-formatted\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"sales1-formatted\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"sales1-formatted\"\n\nMore Information needed"
] |
778f033637abe780df7126809cc05edd88b1fd9d
|
hello
|
rraileanu/dreamcraft_planet_mc_quant
|
[
"region:us"
] |
2023-09-13T08:25:02+00:00
|
{}
|
2023-09-13T08:45:11+00:00
|
[] |
[] |
TAGS
#region-us
|
hello
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
b4da3a069eb2a6df74af09938e2fef76c195bf22
|
# Dataset Card for "paper_test_assym_squeezebert_results"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
nikchar/paper_test_assym_squeezebert_results
|
[
"region:us"
] |
2023-09-13T08:28:14+00:00
|
{"dataset_info": {"features": [{"name": "claim", "dtype": "string"}, {"name": "evidence_wiki_url", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "retrieved_evidence_title", "sequence": "string"}, {"name": "retrieved_evidence_text", "sequence": "string"}, {"name": "labels", "dtype": "int64"}, {"name": "Retrieval_Success", "dtype": "bool"}, {"name": "Predicted_Labels", "dtype": "int64"}, {"name": "Predicted_Labels_Each_doc", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 73601741, "num_examples": 11073}], "download_size": 34426539, "dataset_size": 73601741}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-13T08:28:19+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "paper_test_assym_squeezebert_results"
More Information needed
|
[
"# Dataset Card for \"paper_test_assym_squeezebert_results\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"paper_test_assym_squeezebert_results\"\n\nMore Information needed"
] |
[
6,
25
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"paper_test_assym_squeezebert_results\"\n\nMore Information needed"
] |
18b1dd0d252300a16d9474a8679093a8a14aff93
|
# Dataset Card for "madras_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
pwc-india/madras_dataset
|
[
"region:us"
] |
2023-09-13T08:30:27+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 22751754.0, "num_examples": 10}], "download_size": 22753302, "dataset_size": 22751754.0}}
|
2023-09-13T08:30:29+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "madras_dataset"
More Information needed
|
[
"# Dataset Card for \"madras_dataset\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"madras_dataset\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"madras_dataset\"\n\nMore Information needed"
] |
1d49f5a1dc46bf07ead6fa336e39e7c393119f19
|
This dataset was produced mainly for voice cloning tasks. Voice cloning requires an audio dataset, so this repository contains
283 voice samples of Barack Obama in .wav format, after processing and metadata updates.
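A minimal usage sketch (my assumption, not part of the original card: it presumes the repository stores plain .wav files and that huggingface_hub is installed):

```python
from huggingface_hub import snapshot_download
import glob
import wave

# Download the dataset repository locally (repo id taken from this card).
local_dir = snapshot_download(repo_id="RaysDipesh/obama-voice-samples-283",
                              repo_type="dataset")

# Inspect the first few .wav samples: sample rate and duration.
for path in sorted(glob.glob(f"{local_dir}/**/*.wav", recursive=True))[:3]:
    with wave.open(path, "rb") as w:
        duration = w.getnframes() / w.getframerate()
        print(path, w.getframerate(), "Hz,", round(duration, 2), "s")
```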
|
RaysDipesh/obama-voice-samples-283
|
[
"region:us"
] |
2023-09-13T08:42:20+00:00
|
{}
|
2023-09-13T09:02:55+00:00
|
[] |
[] |
TAGS
#region-us
|
This dataset was produced mainly for voice cloning tasks. Voice cloning requires an audio dataset, so this repository contains
283 voice samples of Barack Obama in .wav format, after processing and metadata updates.
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
fbde5c52f9191c09fa924214db54fb9c13dd0a00
|
# Dataset Card for "music_align_music_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
ryanc/music_align_music_qa
|
[
"region:us"
] |
2023-09-13T08:42:32+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "caption", "sequence": "string"}, {"name": "audio", "dtype": "audio"}], "splits": [{"name": "train", "num_bytes": 34030062976.128, "num_examples": 13102}], "download_size": 4954671190, "dataset_size": 34030062976.128}}
|
2023-09-14T06:32:34+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "music_align_music_qa"
More Information needed
|
[
"# Dataset Card for \"music_align_music_qa\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"music_align_music_qa\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"music_align_music_qa\"\n\nMore Information needed"
] |
ee5718bd6649f64747e393fa10eac51bdaffa9f3
|
# Dataset Card for Evaluation run of budecosystem/genz-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/budecosystem/genz-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [budecosystem/genz-70b](https://huggingface.co/budecosystem/genz-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_budecosystem__genz-70b",
"harness_winogrande_5",
split="train")
```
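The aggregated metrics shown below can likewise be loaded from the "results" configuration (config and split names as declared in this repository's metadata):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; the "latest" split
# always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_budecosystem__genz-70b",
	"results",
	split="latest")
print(results[0])
```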
## Latest results
These are the [latest results from run 2023-10-23T19:01:32.642131](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__genz-70b/blob/main/results_2023-10-23T19-01-32.642131.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.421875,
"em_stderr": 0.005057576044482799,
"f1": 0.5428481543624201,
"f1_stderr": 0.004562270615925701,
"acc": 0.5862101051177826,
"acc_stderr": 0.011727291302229777
},
"harness|drop|3": {
"em": 0.421875,
"em_stderr": 0.005057576044482799,
"f1": 0.5428481543624201,
"f1_stderr": 0.004562270615925701
},
"harness|gsm8k|5": {
"acc": 0.3373768006065201,
"acc_stderr": 0.013023665136222105
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.010430917468237448
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_budecosystem__genz-70b
|
[
"region:us"
] |
2023-09-13T08:54:20+00:00
|
{"pretty_name": "Evaluation run of budecosystem/genz-70b", "dataset_summary": "Dataset automatically created during the evaluation run of model [budecosystem/genz-70b](https://huggingface.co/budecosystem/genz-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_budecosystem__genz-70b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T19:01:32.642131](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__genz-70b/blob/main/results_2023-10-23T19-01-32.642131.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.421875,\n \"em_stderr\": 0.005057576044482799,\n \"f1\": 0.5428481543624201,\n \"f1_stderr\": 0.004562270615925701,\n \"acc\": 0.5862101051177826,\n \"acc_stderr\": 0.011727291302229777\n },\n \"harness|drop|3\": {\n \"em\": 0.421875,\n \"em_stderr\": 0.005057576044482799,\n \"f1\": 0.5428481543624201,\n \"f1_stderr\": 0.004562270615925701\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3373768006065201,\n \"acc_stderr\": 0.013023665136222105\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237448\n }\n}\n```", "repo_url": "https://huggingface.co/budecosystem/genz-70b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|arc:challenge|25_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T19_01_32.642131", "path": ["**/details_harness|drop|3_2023-10-23T19-01-32.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T19-01-32.642131.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T19_01_32.642131", "path": ["**/details_harness|gsm8k|5_2023-10-23T19-01-32.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T19-01-32.642131.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hellaswag|10_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-54-04.852738.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-54-04.852738.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T09-54-04.852738.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-54-04.852738.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T09-54-04.852738.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T09-54-04.852738.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T19_01_32.642131", "path": ["**/details_harness|winogrande|5_2023-10-23T19-01-32.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T19-01-32.642131.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_13T09_54_04.852738", "path": ["results_2023-09-13T09-54-04.852738.parquet"]}, {"split": "2023_10_23T19_01_32.642131", "path": ["results_2023-10-23T19-01-32.642131.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T19-01-32.642131.parquet"]}]}]}
|
2023-10-23T18:01:46+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of budecosystem/genz-70b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model budecosystem/genz-70b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can, for instance, do the following:
## Latest results
These are the latest results from run 2023-10-23T19:01:32.642131 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of budecosystem/genz-70b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model budecosystem/genz-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T19:01:32.642131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of budecosystem/genz-70b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model budecosystem/genz-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T19:01:32.642131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
165,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of budecosystem/genz-70b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model budecosystem/genz-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T19:01:32.642131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
d1bb0e2858406a254a0f150b68da875c34acd67a
|
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
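Given the schema declared in this repository's metadata (issue fields such as `title`, `body`, `comments`, and an `is_pull_request` flag), a minimal loading sketch:

```python
from datasets import load_dataset

# 1,000 GitHub issues in a single "train" split, per the repo metadata.
issues = load_dataset("qayqaq/github-issues", split="train")

# GitHub's REST API returns pull requests as issues; the "is_pull_request"
# flag lets us keep genuine issues only.
issues = issues.filter(lambda row: not row["is_pull_request"])
print(issues[0]["title"])
```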
|
qayqaq/github-issues
|
[
"region:us"
] |
2023-09-13T08:55:45+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "url", "dtype": "string"}, {"name": "repository_url", "dtype": "string"}, {"name": "labels_url", "dtype": "string"}, {"name": "comments_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "number", "dtype": "int64"}, {"name": "title", "dtype": "string"}, {"name": "user", "struct": [{"name": "login", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "avatar_url", "dtype": "string"}, {"name": "gravatar_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "followers_url", "dtype": "string"}, {"name": "following_url", "dtype": "string"}, {"name": "gists_url", "dtype": "string"}, {"name": "starred_url", "dtype": "string"}, {"name": "subscriptions_url", "dtype": "string"}, {"name": "organizations_url", "dtype": "string"}, {"name": "repos_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "received_events_url", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "site_admin", "dtype": "bool"}]}, {"name": "labels", "list": [{"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "name", "dtype": "string"}, {"name": "color", "dtype": "string"}, {"name": "default", "dtype": "bool"}, {"name": "description", "dtype": "string"}]}, {"name": "state", "dtype": "string"}, {"name": "locked", "dtype": "bool"}, {"name": "assignee", "struct": [{"name": "login", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "avatar_url", "dtype": "string"}, {"name": "gravatar_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "followers_url", "dtype": "string"}, {"name": "following_url", "dtype": "string"}, {"name": "gists_url", "dtype": "string"}, {"name": "starred_url", "dtype": "string"}, {"name": "subscriptions_url", "dtype": "string"}, {"name": "organizations_url", "dtype": "string"}, {"name": "repos_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "received_events_url", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "site_admin", "dtype": "bool"}]}, {"name": "assignees", "list": [{"name": "login", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "avatar_url", "dtype": "string"}, {"name": "gravatar_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "followers_url", "dtype": "string"}, {"name": "following_url", "dtype": "string"}, {"name": "gists_url", "dtype": "string"}, {"name": "starred_url", "dtype": "string"}, {"name": "subscriptions_url", "dtype": "string"}, {"name": "organizations_url", "dtype": "string"}, {"name": "repos_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "received_events_url", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "site_admin", "dtype": "bool"}]}, {"name": "comments", "sequence": "string"}, {"name": "created_at", "dtype": "timestamp[s]"}, {"name": "updated_at", "dtype": "timestamp[s]"}, {"name": "closed_at", "dtype": "timestamp[s]"}, {"name": "author_association", "dtype": "string"}, 
{"name": "body", "dtype": "string"}, {"name": "reactions", "struct": [{"name": "url", "dtype": "string"}, {"name": "total_count", "dtype": "int64"}, {"name": "+1", "dtype": "int64"}, {"name": "-1", "dtype": "int64"}, {"name": "laugh", "dtype": "int64"}, {"name": "hooray", "dtype": "int64"}, {"name": "confused", "dtype": "int64"}, {"name": "heart", "dtype": "int64"}, {"name": "rocket", "dtype": "int64"}, {"name": "eyes", "dtype": "int64"}]}, {"name": "timeline_url", "dtype": "string"}, {"name": "state_reason", "dtype": "string"}, {"name": "draft", "dtype": "bool"}, {"name": "pull_request", "struct": [{"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "diff_url", "dtype": "string"}, {"name": "patch_url", "dtype": "string"}, {"name": "merged_at", "dtype": "timestamp[s]"}]}, {"name": "is_pull_request", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 11732758, "num_examples": 1000}], "download_size": 3212220, "dataset_size": 11732758}}
|
2023-09-13T08:55:51+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "github-issues"
More Information needed
|
[
"# Dataset Card for \"github-issues\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"github-issues\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"github-issues\"\n\nMore Information needed"
] |
45e39a5135982c57d2cee4557e1a1a7b1bf0b0e1
|
# Dataset Card for Evaluation run of pankajmathur/orca_mini_v3_7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/pankajmathur/orca_mini_v3_7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [pankajmathur/orca_mini_v3_7b](https://huggingface.co/pankajmathur/orca_mini_v3_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pankajmathur__orca_mini_v3_7b",
"harness_winogrande_5",
split="train")
```
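As with the sibling evaluation datasets, the aggregated metrics below should also be loadable from the "results" configuration; a minimal sketch (the "latest" split name is assumed from that convention):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; "latest" is assumed to
# point at the most recent evaluation, following the other details_* datasets.
results = load_dataset("open-llm-leaderboard/details_pankajmathur__orca_mini_v3_7b",
	"results",
	split="latest")
print(results[0])
```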
## Latest results
These are the [latest results from run 2023-10-24T09:53:37.786344](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__orca_mini_v3_7b/blob/main/results_2023-10-24T09-53-37.786344.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08043204697986577,
"em_stderr": 0.0027851341980506704,
"f1": 0.15059563758389252,
"f1_stderr": 0.0030534563383277672,
"acc": 0.4069827001752661,
"acc_stderr": 0.009686225873410097
},
"harness|drop|3": {
"em": 0.08043204697986577,
"em_stderr": 0.0027851341980506704,
"f1": 0.15059563758389252,
"f1_stderr": 0.0030534563383277672
},
"harness|gsm8k|5": {
"acc": 0.0712661106899166,
"acc_stderr": 0.007086462127954491
},
"harness|winogrande|5": {
"acc": 0.7426992896606156,
"acc_stderr": 0.012285989618865706
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_pankajmathur__orca_mini_v3_7b
|
[
"region:us"
] |
2023-09-13T08:57:03+00:00
|
{"pretty_name": "Evaluation run of pankajmathur/orca_mini_v3_7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [pankajmathur/orca_mini_v3_7b](https://huggingface.co/pankajmathur/orca_mini_v3_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pankajmathur__orca_mini_v3_7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-24T09:53:37.786344](https://huggingface.co/datasets/open-llm-leaderboard/details_pankajmathur__orca_mini_v3_7b/blob/main/results_2023-10-24T09-53-37.786344.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08043204697986577,\n \"em_stderr\": 0.0027851341980506704,\n \"f1\": 0.15059563758389252,\n \"f1_stderr\": 0.0030534563383277672,\n \"acc\": 0.4069827001752661,\n \"acc_stderr\": 0.009686225873410097\n },\n \"harness|drop|3\": {\n \"em\": 0.08043204697986577,\n \"em_stderr\": 0.0027851341980506704,\n \"f1\": 0.15059563758389252,\n \"f1_stderr\": 0.0030534563383277672\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \"acc_stderr\": 0.007086462127954491\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7426992896606156,\n \"acc_stderr\": 0.012285989618865706\n }\n}\n```", "repo_url": "https://huggingface.co/pankajmathur/orca_mini_v3_7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|arc:challenge|25_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_24T09_53_37.786344", "path": ["**/details_harness|drop|3_2023-10-24T09-53-37.786344.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-24T09-53-37.786344.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_24T09_53_37.786344", "path": ["**/details_harness|gsm8k|5_2023-10-24T09-53-37.786344.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-24T09-53-37.786344.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hellaswag|10_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-56-47.532864.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-56-47.532864.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T09-56-47.532864.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T09-56-47.532864.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T09-56-47.532864.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_24T09_53_37.786344", "path": ["**/details_harness|winogrande|5_2023-10-24T09-53-37.786344.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-24T09-53-37.786344.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_13T09_56_47.532864", "path": ["results_2023-09-13T09-56-47.532864.parquet"]}, {"split": "2023_10_24T09_53_37.786344", "path": ["results_2023-10-24T09-53-37.786344.parquet"]}, {"split": "latest", "path": ["results_2023-10-24T09-53-37.786344.parquet"]}]}]}
|
2023-10-24T08:53:50+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of pankajmathur/orca_mini_v3_7b
## Dataset Description
- Homepage:
- Repository: https://huggingface.co/pankajmathur/orca_mini_v3_7b
- Paper:
- Leaderboard: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- Point of Contact: [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model pankajmathur/orca_mini_v3_7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
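```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pankajmathur__orca_mini_v3_7b",
	"harness_winogrande_5",
	split="train")
```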
## Latest results
These are the latest results from run 2023-10-24T09:53:37.786344 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
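```python
{
    "all": {
        "em": 0.08043204697986577,
        "em_stderr": 0.0027851341980506704,
        "f1": 0.15059563758389252,
        "f1_stderr": 0.0030534563383277672,
        "acc": 0.4069827001752661,
        "acc_stderr": 0.009686225873410097
    },
    "harness|drop|3": {
        "em": 0.08043204697986577,
        "em_stderr": 0.0027851341980506704,
        "f1": 0.15059563758389252,
        "f1_stderr": 0.0030534563383277672
    },
    "harness|gsm8k|5": {
        "acc": 0.0712661106899166,
        "acc_stderr": 0.007086462127954491
    },
    "harness|winogrande|5": {
        "acc": 0.7426992896606156,
        "acc_stderr": 0.012285989618865706
    }
}
```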
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of pankajmathur/orca_mini_v3_7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model pankajmathur/orca_mini_v3_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T09:53:37.786344(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of pankajmathur/orca_mini_v3_7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model pankajmathur/orca_mini_v3_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T09:53:37.786344(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of pankajmathur/orca_mini_v3_7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model pankajmathur/orca_mini_v3_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-24T09:53:37.786344(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
a5225b0c648977afad6b33d122895fe3d4717a99
|
# Dataset Card for "openhermes-k8"
[teknium/openhermes](https://hf.co/datasets/teknium/openhermes) clustered into 8 clusters; the cluster centroids are included in 'centers.pt'. A loading sketch follows.
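A minimal loading sketch, not an official loader: it assumes 'centers.pt' sits at the dataset repo root and was saved with `torch.save`; the column names (`output`, `instruction`, `input`, `cluster`) come from the dataset metadata below.

```python
# Minimal sketch under the assumptions stated above.
import torch
from collections import Counter
from datasets import load_dataset
from huggingface_hub import hf_hub_download

# Load the clustered rows (columns: output, instruction, input, cluster).
data = load_dataset("crumb/openhermes-k8", split="train")

# Fetch the centroid file from the dataset repo (assumed path: repo root).
centers_path = hf_hub_download(
    repo_id="crumb/openhermes-k8",
    repo_type="dataset",
    filename="centers.pt",
)
centers = torch.load(centers_path)  # assumed: a tensor of the 8 cluster centroids

# Example use: count how many examples landed in each cluster.
print(Counter(data["cluster"]))
```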
|
crumb/openhermes-k8
|
[
"region:us"
] |
2023-09-13T08:57:58+00:00
|
{"dataset_info": {"features": [{"name": "output", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "cluster", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 309315994, "num_examples": 242831}], "download_size": 143821416, "dataset_size": 309315994}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-13T09:02:45+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "openhermes-k8"
teknium/openhermes clustered with 8 clusters, included are the centroids in 'URL'
|
[
"# Dataset Card for \"openhermes-k8\"\n\nteknium/openhermes clustered with 8 clusters, included are the centroids in 'URL'"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"openhermes-k8\"\n\nteknium/openhermes clustered with 8 clusters, included are the centroids in 'URL'"
] |
[
6,
35
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"openhermes-k8\"\n\nteknium/openhermes clustered with 8 clusters, included are the centroids in 'URL'"
] |
ef078f84ff039182f72291486880f7c4c7f05689
|
# Dataset Card for Evaluation run of conceptofmind/Hermes-LLongMA-2-7b-8k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/conceptofmind/Hermes-LLongMA-2-7b-8k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [conceptofmind/Hermes-LLongMA-2-7b-8k](https://huggingface.co/conceptofmind/Hermes-LLongMA-2-7b-8k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-13T10:12:42.075501](https://huggingface.co/datasets/open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k/blob/main/results_2023-09-13T10-12-42.075501.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2927085297989137,
"acc_stderr": 0.03275517148401362,
"acc_norm": 0.29622047886660385,
"acc_norm_stderr": 0.032746678820457335,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662594,
"mc2": 0.38838708556166845,
"mc2_stderr": 0.014198737236851828
},
"harness|arc:challenge|25": {
"acc": 0.46928327645051193,
"acc_stderr": 0.014583792546304038,
"acc_norm": 0.4974402730375427,
"acc_norm_stderr": 0.014611199329843777
},
"harness|hellaswag|10": {
"acc": 0.5497908783110934,
"acc_stderr": 0.004964979120927565,
"acc_norm": 0.7288388767177854,
"acc_norm_stderr": 0.004436505187567003
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.035914440841969694
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.25660377358490566,
"acc_stderr": 0.02688064788905197,
"acc_norm": 0.25660377358490566,
"acc_norm_stderr": 0.02688064788905197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165085,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165085
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749895,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749895
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.029513196625539355,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.029513196625539355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948365,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948365
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.0361960452412425,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.0361960452412425
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.02458002892148101,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.02458002892148101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1921182266009852,
"acc_stderr": 0.02771931570961478,
"acc_norm": 0.1921182266009852,
"acc_norm_stderr": 0.02771931570961478
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206824,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206824
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.03756335775187896,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.03756335775187896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565319,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565319
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.02176373368417392,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.02176373368417392
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.02578787422095932,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.02578787422095932
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868956,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3155963302752294,
"acc_stderr": 0.019926117513869662,
"acc_norm": 0.3155963302752294,
"acc_norm_stderr": 0.019926117513869662
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536023,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536023
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03308611113236435,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03308611113236435
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3881856540084388,
"acc_stderr": 0.0317229500433233,
"acc_norm": 0.3881856540084388,
"acc_norm_stderr": 0.0317229500433233
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.273542600896861,
"acc_stderr": 0.029918586707798824,
"acc_norm": 0.273542600896861,
"acc_norm_stderr": 0.029918586707798824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4132231404958678,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.4132231404958678,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3504273504273504,
"acc_stderr": 0.03125610824421881,
"acc_norm": 0.3504273504273504,
"acc_norm_stderr": 0.03125610824421881
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2988505747126437,
"acc_stderr": 0.01636925681509314,
"acc_norm": 0.2988505747126437,
"acc_norm_stderr": 0.01636925681509314
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.024685316867257796,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.024685316867257796
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3300653594771242,
"acc_stderr": 0.026925654653615686,
"acc_norm": 0.3300653594771242,
"acc_norm_stderr": 0.026925654653615686
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.025630824975621344,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.025630824975621344
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.02689170942834396,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.02689170942834396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.28748370273794005,
"acc_stderr": 0.011559337355708502,
"acc_norm": 0.28748370273794005,
"acc_norm_stderr": 0.011559337355708502
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28104575163398693,
"acc_stderr": 0.018185218954318075,
"acc_norm": 0.28104575163398693,
"acc_norm_stderr": 0.018185218954318075
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4163265306122449,
"acc_stderr": 0.03155782816556165,
"acc_norm": 0.4163265306122449,
"acc_norm_stderr": 0.03155782816556165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.34328358208955223,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.34328358208955223,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.034605799075530276,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.034605799075530276
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.39766081871345027,
"acc_stderr": 0.0375363895576169,
"acc_norm": 0.39766081871345027,
"acc_norm_stderr": 0.0375363895576169
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662594,
"mc2": 0.38838708556166845,
"mc2_stderr": 0.014198737236851828
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k
|
[
"region:us"
] |
2023-09-13T09:12:58+00:00
|
{"pretty_name": "Evaluation run of conceptofmind/Hermes-LLongMA-2-7b-8k", "dataset_summary": "Dataset automatically created during the evaluation run of model [conceptofmind/Hermes-LLongMA-2-7b-8k](https://huggingface.co/conceptofmind/Hermes-LLongMA-2-7b-8k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-13T10:12:42.075501](https://huggingface.co/datasets/open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k/blob/main/results_2023-09-13T10-12-42.075501.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2927085297989137,\n \"acc_stderr\": 0.03275517148401362,\n \"acc_norm\": 0.29622047886660385,\n \"acc_norm_stderr\": 0.032746678820457335,\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.015077219200662594,\n \"mc2\": 0.38838708556166845,\n \"mc2_stderr\": 0.014198737236851828\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.46928327645051193,\n \"acc_stderr\": 0.014583792546304038,\n \"acc_norm\": 0.4974402730375427,\n \"acc_norm_stderr\": 0.014611199329843777\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5497908783110934,\n \"acc_stderr\": 0.004964979120927565,\n \"acc_norm\": 0.7288388767177854,\n \"acc_norm_stderr\": 0.004436505187567003\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.25660377358490566,\n \"acc_stderr\": 0.02688064788905197,\n \"acc_norm\": 0.25660377358490566,\n \"acc_norm_stderr\": 0.02688064788905197\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.24305555555555555,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n 
\"acc_stderr\": 0.042295258468165085,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165085\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749895,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749895\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.029513196625539355,\n \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.029513196625539355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948365,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948365\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.0361960452412425,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.0361960452412425\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n \"acc_stderr\": 0.02458002892148101,\n \"acc_norm\": 0.24838709677419354,\n \"acc_norm_stderr\": 0.02458002892148101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.1921182266009852,\n \"acc_stderr\": 0.02771931570961478,\n \"acc_norm\": 0.1921182266009852,\n \"acc_norm_stderr\": 0.02771931570961478\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206824,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206824\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.36363636363636365,\n \"acc_stderr\": 0.03756335775187896,\n \"acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.03756335775187896\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270285,\n \"acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270285\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565319,\n \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565319\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.24358974358974358,\n \"acc_stderr\": 0.02176373368417392,\n \"acc_norm\": 0.24358974358974358,\n \"acc_norm_stderr\": 0.02176373368417392\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.02578787422095932,\n \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.02578787422095932\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868956,\n \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868956\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3155963302752294,\n \"acc_stderr\": 0.019926117513869662,\n \"acc_norm\": 0.3155963302752294,\n \"acc_norm_stderr\": 0.019926117513869662\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536023,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536023\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03308611113236435,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03308611113236435\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.3881856540084388,\n \"acc_stderr\": 0.0317229500433233,\n \"acc_norm\": 0.3881856540084388,\n \"acc_norm_stderr\": 0.0317229500433233\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.273542600896861,\n \"acc_stderr\": 0.029918586707798824,\n \"acc_norm\": 0.273542600896861,\n \"acc_norm_stderr\": 0.029918586707798824\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.4132231404958678,\n \"acc_stderr\": 0.04495087843548408,\n \"acc_norm\": 0.4132231404958678,\n \"acc_norm_stderr\": 0.04495087843548408\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3504273504273504,\n \"acc_stderr\": 0.03125610824421881,\n \"acc_norm\": 0.3504273504273504,\n \"acc_norm_stderr\": 0.03125610824421881\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.2988505747126437,\n \"acc_stderr\": 0.01636925681509314,\n \"acc_norm\": 0.2988505747126437,\n \"acc_norm_stderr\": 0.01636925681509314\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.30057803468208094,\n \"acc_stderr\": 0.024685316867257796,\n \"acc_norm\": 0.30057803468208094,\n \"acc_norm_stderr\": 0.024685316867257796\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3300653594771242,\n \"acc_stderr\": 0.026925654653615686,\n \"acc_norm\": 0.3300653594771242,\n \"acc_norm_stderr\": 0.026925654653615686\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.025630824975621344,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.025630824975621344\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.28368794326241137,\n \"acc_stderr\": 0.02689170942834396,\n \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.02689170942834396\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28748370273794005,\n \"acc_stderr\": 0.011559337355708502,\n \"acc_norm\": 0.28748370273794005,\n \"acc_norm_stderr\": 0.011559337355708502\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.28104575163398693,\n \"acc_stderr\": 0.018185218954318075,\n \"acc_norm\": 0.28104575163398693,\n \"acc_norm_stderr\": 0.018185218954318075\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4163265306122449,\n \"acc_stderr\": 0.03155782816556165,\n \"acc_norm\": 0.4163265306122449,\n \"acc_norm_stderr\": 0.03155782816556165\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.34328358208955223,\n \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.34328358208955223,\n \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.034605799075530276,\n \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.034605799075530276\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.39766081871345027,\n \"acc_stderr\": 0.0375363895576169,\n \"acc_norm\": 0.39766081871345027,\n \"acc_norm_stderr\": 0.0375363895576169\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.015077219200662594,\n \"mc2\": 0.38838708556166845,\n \"mc2_stderr\": 0.014198737236851828\n }\n}\n```", "repo_url": "https://huggingface.co/conceptofmind/Hermes-LLongMA-2-7b-8k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", 
"configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|arc:challenge|25_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hellaswag|10_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T10-12-42.075501.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-13T10-12-42.075501.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T10-12-42.075501.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-13T10-12-42.075501.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-13T10-12-42.075501.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_13T10_12_42.075501", "path": ["results_2023-09-13T10-12-42.075501.parquet"]}, {"split": "latest", "path": ["results_2023-09-13T10-12-42.075501.parquet"]}]}]}
|
2023-09-13T09:13:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of conceptofmind/Hermes-LLongMA-2-7b-8k
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model conceptofmind/Hermes-LLongMA-2-7b-8k on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
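```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_conceptofmind__Hermes-LLongMA-2-7b-8k",
    "harness_truthfulqa_mc_0",  # one config per evaluated task
    split="train",  # "train" always points to the latest results
)
```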
## Latest results
These are the latest results from run 2023-09-13T10:12:42.075501 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
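For reference, the aggregate block of that results file reads as follows (per-task scores for ARC, HellaSwag, the MMLU subtasks, and TruthfulQA are in the full file):

```python
{
    "all": {
        "acc": 0.2927085297989137,
        "acc_stderr": 0.03275517148401362,
        "acc_norm": 0.29622047886660385,
        "acc_norm_stderr": 0.032746678820457335,
        "mc1": 0.2460220318237454,
        "mc1_stderr": 0.015077219200662594,
        "mc2": 0.38838708556166845,
        "mc2_stderr": 0.014198737236851828
    }
}
```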
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of conceptofmind/Hermes-LLongMA-2-7b-8k",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model conceptofmind/Hermes-LLongMA-2-7b-8k on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-13T10:12:42.075501(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of conceptofmind/Hermes-LLongMA-2-7b-8k",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model conceptofmind/Hermes-LLongMA-2-7b-8k on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-13T10:12:42.075501(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of conceptofmind/Hermes-LLongMA-2-7b-8k## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model conceptofmind/Hermes-LLongMA-2-7b-8k on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-13T10:12:42.075501(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
597b3052e0a99a5e29d7bf585180781f5dd1a68e
|
This is the same dataset as [`ag_news`](https://huggingface.co/datasets/ag_news).
The only differences are
1. Addition of a unique identifier, `uid`
1. Addition of the indices, that is, 3 columns with the embeddings of 3 different sentence-transformers
- `all-mpnet-base-v2`
- `multi-qa-mpnet-base-dot-v1`
- `all-MiniLM-L12-v2`
1. Renaming of the `label` column to `labels` for easier compatibility with the transformers library
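A minimal loading sketch, assuming the configs declared in this repo's metadata (`default` plus one config per embedding model) and joining text rows to embeddings on `uid`:

```python
from datasets import load_dataset

# Text + labels (+ uid); `labels` is the renamed AG News `label` column.
train = load_dataset("pietrolesci/agnews", "default", split="train")

# One of the three precomputed embedding configs, keyed by the same `uid`.
emb = load_dataset("pietrolesci/agnews", "embedding_all-MiniLM-L12-v2", split="train")

# Align a text row with its embedding via `uid` (fine as a sketch; for
# large-scale use, join the two tables instead of materializing a dict).
uid_to_vec = {row["uid"]: row["embedding_all-MiniLM-L12-v2"] for row in emb}
vec = uid_to_vec[train[0]["uid"]]
```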
|
pietrolesci/agnews
|
[
"task_categories:text-classification",
"size_categories:100K<n<1M",
"language:en",
"region:us"
] |
2023-09-13T09:17:01+00:00
|
{"language": ["en"], "size_categories": ["100K<n<1M"], "task_categories": ["text-classification"], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}, {"config_name": "embedding_all-MiniLM-L12-v2", "data_files": [{"split": "train", "path": "embedding_all-MiniLM-L12-v2/train-*"}, {"split": "test", "path": "embedding_all-MiniLM-L12-v2/test-*"}]}, {"config_name": "embedding_all-mpnet-base-v2", "data_files": [{"split": "train", "path": "embedding_all-mpnet-base-v2/train-*"}, {"split": "test", "path": "embedding_all-mpnet-base-v2/test-*"}]}, {"config_name": "embedding_multi-qa-mpnet-base-dot-v1", "data_files": [{"split": "train", "path": "embedding_multi-qa-mpnet-base-dot-v1/train-*"}, {"split": "test", "path": "embedding_multi-qa-mpnet-base-dot-v1/test-*"}]}], "dataset_info": [{"config_name": "default", "features": [{"name": "text", "dtype": "string"}, {"name": "labels", "dtype": {"class_label": {"names": {"0": "World", "1": "Sports", "2": "Business", "3": "Sci/Tech"}}}}, {"name": "uid", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 30777303, "num_examples": 120000}, {"name": "test", "num_bytes": 1940274, "num_examples": 7600}], "download_size": 20531429, "dataset_size": 32717577}, {"config_name": "embedding_all-MiniLM-L12-v2", "features": [{"name": "uid", "dtype": "int64"}, {"name": "embedding_all-MiniLM-L12-v2", "sequence": "float32"}], "splits": [{"name": "train", "num_bytes": 185760000, "num_examples": 120000}, {"name": "test", "num_bytes": 11764800, "num_examples": 7600}], "download_size": 276467219, "dataset_size": 197524800}, {"config_name": "embedding_all-mpnet-base-v2", "features": [{"name": "uid", "dtype": "int64"}, {"name": "embedding_all-mpnet-base-v2", "sequence": "float32"}], "splits": [{"name": "train", "num_bytes": 370080000, "num_examples": 120000}, {"name": "test", "num_bytes": 23438400, "num_examples": 7600}], "download_size": 472647323, "dataset_size": 393518400}, {"config_name": "embedding_multi-qa-mpnet-base-dot-v1", "features": [{"name": "uid", "dtype": "int64"}, {"name": "embedding_multi-qa-mpnet-base-dot-v1", "sequence": "float32"}], "splits": [{"name": "train", "num_bytes": 370080000, "num_examples": 120000}, {"name": "test", "num_bytes": 23438400, "num_examples": 7600}], "download_size": 472640830, "dataset_size": 393518400}]}
|
2023-09-13T11:02:12+00:00
|
[] |
[
"en"
] |
TAGS
#task_categories-text-classification #size_categories-100K<n<1M #language-English #region-us
|
This is the same dataset as 'ag_news'.
The only differences are
1. Addition of a unique identifier, 'uid'
1. Addition of the indices, that is, 3 columns with the embeddings of 3 different sentence-transformers
- 'all-mpnet-base-v2'
- 'multi-qa-mpnet-base-dot-v1'
- 'all-MiniLM-L12-v2'
1. Renaming of the 'label' column to 'labels' for easier compatibility with the transformers library
|
[] |
[
"TAGS\n#task_categories-text-classification #size_categories-100K<n<1M #language-English #region-us \n"
] |
[
33
] |
[
"passage: TAGS\n#task_categories-text-classification #size_categories-100K<n<1M #language-English #region-us \n"
] |
8171b441d7f68462b9aaecf98e3a9b09a9b974cd
|
A Vietnamese edition of the MS MARCO passage ranking dataset.
It is part of the mMARCO project: https://github.com/unicamp-dl/mMARCO
(This repo includes 10,000 Vietnamese samples, not the full Vietnamese dataset.)
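A minimal loading sketch, assuming the `query`/`positive`/`negative` columns declared in this repo's config:

```python
from datasets import load_dataset

# 10,000 (query, positive, negative) triples in a single train split.
ds = load_dataset("ngoan/ms_marco_vietnamese", split="train")
print(ds[0]["query"])
print(ds[0]["positive"][:100])
```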
## Citation
```bibtex
@misc{bonifacio2021mmarco,
title={mMARCO: A Multilingual Version of MS MARCO Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Vitor Jeronymo and Hugo Queiroz Abonizio and Israel Campiotti and Marzieh Fadaee and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
eprint={2108.13897},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
ngoan/ms_marco_vietnamese
|
[
"license:apache-2.0",
"arxiv:2108.13897",
"region:us"
] |
2023-09-13T09:17:39+00:00
|
{"license": "apache-2.0", "dataset_info": {"features": [{"name": "query", "dtype": "string"}, {"name": "positive", "dtype": "string"}, {"name": "negative", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 9920633, "num_examples": 10000}], "download_size": 3598309, "dataset_size": 9920633}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-14T04:52:46+00:00
|
[
"2108.13897"
] |
[] |
TAGS
#license-apache-2.0 #arxiv-2108.13897 #region-us
|
A Vietnamese edition of the MS MARCO passage ranking dataset.
It's a part of the mMARCO project: URL
(Includes 10,000 Vietnamese samples, but not the full Vietnamese dataset.)
## Citation

'''bibtex
@misc{bonifacio2021mmarco,
title={mMARCO: A Multilingual Version of MS MARCO Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Vitor Jeronymo and Hugo Queiroz Abonizio and Israel Campiotti and Marzieh Fadaee and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
eprint={2108.13897},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
'''
|
[] |
[
"TAGS\n#license-apache-2.0 #arxiv-2108.13897 #region-us \n"
] |
[
22
] |
[
"passage: TAGS\n#license-apache-2.0 #arxiv-2108.13897 #region-us \n"
] |