Dataset schema (value-length ranges per column):

| Column | Type | Min length | Max length |
|:-----------------|:-------|:-----------|:-----------|
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | list | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | list | 0 | 25 |
| languages | list | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | list | 0 | 352 |
| processed_texts | list | 1 | 353 |
| tokens_length | list | 1 | 353 |
| input_texts | list | 1 | 40 |
081962f7dd5737ecef136107c074de6108c89ebf
# Dataset Card for "cc191d54" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/cc191d54
[ "region:us" ]
2023-09-11T17:42:05+00:00
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 186, "num_examples": 10}], "download_size": 1340, "dataset_size": 186}}
2023-09-11T17:42:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for "cc191d54" More Information needed
[ "# Dataset Card for \"cc191d54\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"cc191d54\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"cc191d54\"\n\nMore Information needed" ]
9f298ba418b16e3b2f820f90e7a3913f22344df6
# Dataset Card for "5f55ccdd" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/5f55ccdd
[ "region:us" ]
2023-09-11T17:45:42+00:00
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 180, "num_examples": 10}], "download_size": 1337, "dataset_size": 180}}
2023-09-11T17:45:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for "5f55ccdd" More Information needed
[ "# Dataset Card for \"5f55ccdd\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"5f55ccdd\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"5f55ccdd\"\n\nMore Information needed" ]
01ae6c2aa5857c4055b14e8064b6dbc9b5526be1
# Dataset Card for "3a222eb5" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/3a222eb5
[ "region:us" ]
2023-09-11T17:51:25+00:00
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 180, "num_examples": 10}], "download_size": 1336, "dataset_size": 180}}
2023-09-11T17:51:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for "3a222eb5" More Information needed
[ "# Dataset Card for \"3a222eb5\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"3a222eb5\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"3a222eb5\"\n\nMore Information needed" ]
04aabd0527ccb4c0a00a27f86b67e7f858d48963
# Dataset of irisviel_holy_grail/アイリスフィール〔天の衣〕/爱丽丝菲尔〔天之衣〕 (Fate/Grand Order) This is the dataset of irisviel_holy_grail/アイリスフィール〔天の衣〕/爱丽丝菲尔〔天之衣〕 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are `long_hair, white_hair, red_eyes, breasts, large_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 467.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/irisviel_holy_grail_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 313.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/irisviel_holy_grail_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 932 | 556.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/irisviel_holy_grail_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 430.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/irisviel_holy_grail_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 932 | 729.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/irisviel_holy_grail_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/irisviel_holy_grail_fgo', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, dress | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, dress, solo, smile, blush, chibi | | 2 | 30 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, pantyhose, thighhighs, thigh_boots, skirt, solo | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, coat, fur_hat, pantyhose, solo, thigh_boots, thighhighs, sitting, white_footwear | | 4 | 16 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, coat, fur_hat, solo | | 5 | 13 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, blush, closed_mouth, looking_at_viewer, smile, solo, parted_bangs, white_dress, bare_shoulders, huge_breasts, sidelocks, sleeveless_dress, thick_thighs, covered_nipples, curvy, pelvic_curtain, sideboob, wide_hips, standing, clothing_cutout, covered_navel, cowboy_shot | | 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, hetero, nipples, penis, solo_focus, 1boy, blush, cum_on_breasts, fellatio, bar_censor, cum_in_mouth, facial, nude, paizuri, pointless_censoring, tongue | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, cleavage, demon_girl, demon_horns, solo, upper_body, areola_slip, choker, collarbone, looking_at_viewer, bare_shoulders, detached_sleeves, arms_behind_head, arms_up, eyes_visible_through_hair, parted_lips, smile, very_long_hair | | 8 | 14 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, kimono, ponytail, hair_bow, solo, official_alternate_costume, one_eye_closed, smile | | 9 | 9 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1boy, hetero, nipples, penis, 1girl, blush, mosaic_censoring, vaginal, pussy, spread_legs, nude, open_mouth, 
solo_focus, sweat, sex_from_behind, thighhighs, girl_on_top, navel, straddling | | 10 | 12 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, completely_nude, sweat, nipples, open_mouth, 1boy, hetero, navel, parted_bangs, pussy, sex, thighs, vaginal, penis, solo_focus, spread_legs, bed_sheet, collarbone, mosaic_censoring, on_back, shiny_skin, sidelocks, body_blush, looking_at_viewer, on_bed, breasts_apart, eyes_visible_through_hair, forehead, frilled_pillow | | 11 | 5 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 2girls, blonde_hair, dress, ahoge, cleavage, medium_breasts | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | dress | bare_shoulders | smile | blush | chibi | pantyhose | thighhighs | thigh_boots | skirt | coat | fur_hat | sitting | white_footwear | closed_mouth | looking_at_viewer | parted_bangs | white_dress | huge_breasts | sidelocks | sleeveless_dress | thick_thighs | covered_nipples | curvy | pelvic_curtain | sideboob | wide_hips | standing | clothing_cutout | covered_navel | cowboy_shot | hetero | nipples | penis | solo_focus | 1boy | cum_on_breasts | fellatio | bar_censor | cum_in_mouth | facial | nude | paizuri | pointless_censoring | tongue | cleavage | demon_girl | demon_horns | upper_body | areola_slip | choker | collarbone | detached_sleeves | arms_behind_head | arms_up | eyes_visible_through_hair | parted_lips | very_long_hair | kimono | ponytail | hair_bow | official_alternate_costume | one_eye_closed | mosaic_censoring | vaginal | pussy | spread_legs | open_mouth | sweat | sex_from_behind | girl_on_top | navel | straddling | completely_nude | sex | thighs | bed_sheet | on_back | shiny_skin | body_blush | on_bed | breasts_apart | forehead | frilled_pillow | 2girls | blonde_hair | ahoge | medium_breasts | 
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------|:--------|:-----------------|:--------|:--------|:--------|:------------|:-------------|:--------------|:--------|:-------|:----------|:----------|:-----------------|:---------------|:--------------------|:---------------|:--------------|:---------------|:------------|:-------------------|:---------------|:------------------|:--------|:-----------------|:-----------|:------------|:-----------|:------------------|:----------------|:--------------|:---------|:----------|:--------|:-------------|:-------|:-----------------|:-----------|:-------------|:---------------|:---------|:-------|:----------|:----------------------|:---------|:-----------|:-------------|:--------------|:-------------|:--------------|:---------|:-------------|:-------------------|:-------------------|:----------|:----------------------------|:--------------|:-----------------|:---------|:-----------|:-----------|:-----------------------------|:-----------------|:-------------------|:----------|:--------|:--------------|:-------------|:--------|:------------------|:--------------|:--------|:-------------|:------------------|:------|:---------|:------------|:----------|:-------------|:-------------|:---------|:----------------|:-----------|:-----------------|:---------|:--------------|:--------|:-----------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 30 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | | | | | X | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 16 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 13 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X 
| X | | X | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | X | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 14 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 9 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | 10 | 12 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | | | | | | | | | | | | | | | X | X | | | X | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | X | X | X | X | X | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | 11 | 5 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X |
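As a complement to the Waifuc-Raw snippet embedded in the card above, here is a minimal sketch for working with one of the IMG+TXT packages (e.g. `dataset-800.zip`) directly. It assumes the usual IMG+TXT layout in which each image is accompanied by a caption `.txt` file sharing the same stem; verify the actual archive contents before relying on it.

```python
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package (shorter side not exceeding 800 pixels)
zip_file = hf_hub_download(
    repo_id='CyberHarem/irisviel_holy_grail_fgo',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract into a local directory
dataset_dir = Path('irisviel_800')
dataset_dir.mkdir(exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its caption file (assumed: same stem, .txt suffix;
# the image extension may be .png/.jpg/.webp, adjust the glob as needed)
for image_path in sorted(dataset_dir.rglob('*.png')):
    caption_path = image_path.with_suffix('.txt')
    if caption_path.exists():
        tags = caption_path.read_text(encoding='utf-8').strip()
        print(image_path.name, '->', tags)
```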
CyberHarem/irisviel_holy_grail_fgo
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-11T17:52:51+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-13T04:46:28+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of irisviel\_holy\_grail/アイリスフィール〔天の衣〕/爱丽丝菲尔〔天之衣〕 (Fate/Grand Order) ============================================================================ This is the dataset of irisviel\_holy\_grail/アイリスフィール〔天の衣〕/爱丽丝菲尔〔天之衣〕 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are 'long\_hair, white\_hair, red\_eyes, breasts, large\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
880649e640c6b960cf8cdc1766c021303624ed3d
# Dataset Card for "b1b692c4" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/b1b692c4
[ "region:us" ]
2023-09-11T17:54:32+00:00
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 180, "num_examples": 10}], "download_size": 1340, "dataset_size": 180}}
2023-09-11T17:54:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for "b1b692c4" More Information needed
[ "# Dataset Card for \"b1b692c4\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"b1b692c4\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"b1b692c4\"\n\nMore Information needed" ]
faf31781212c534d80977bff7c24b995c49074a1
# Dataset Card for "f17e5747" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/f17e5747
[ "region:us" ]
2023-09-11T17:58:57+00:00
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 178, "num_examples": 10}], "download_size": 1331, "dataset_size": 178}}
2023-09-11T17:58:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for "f17e5747" More Information needed
[ "# Dataset Card for \"f17e5747\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"f17e5747\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"f17e5747\"\n\nMore Information needed" ]
b398de7bf776b70c938da386c3eb20e91db5a5fd
# Dataset Card for "534b0197" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/534b0197
[ "region:us" ]
2023-09-11T18:00:41+00:00
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 188, "num_examples": 10}], "download_size": 1337, "dataset_size": 188}}
2023-09-11T18:00:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for "534b0197" More Information needed
[ "# Dataset Card for \"534b0197\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"534b0197\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"534b0197\"\n\nMore Information needed" ]
d7adecd6c31c77554bec19055a945483b7b5120a
# Dataset Card for "e11a2ce6" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/e11a2ce6
[ "region:us" ]
2023-09-11T18:04:11+00:00
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 188, "num_examples": 10}], "download_size": 1341, "dataset_size": 188}}
2023-09-11T18:04:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for "e11a2ce6" More Information needed
[ "# Dataset Card for \"e11a2ce6\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"e11a2ce6\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"e11a2ce6\"\n\nMore Information needed" ]
8fce2af658f74ff0322c1a572950ed262f8f5c27
# Dataset Card for "visdecode_ocr" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
martinsinnona/visdecode_ocr
[ "region:us" ]
2023-09-11T18:09:40+00:00
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4430219.0, "num_examples": 400}, {"name": "test", "num_bytes": 680418.0, "num_examples": 60}], "download_size": 5118908, "dataset_size": 5110637.0}}
2023-09-13T15:39:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for "visdecode_ocr" More Information needed
[ "# Dataset Card for \"visdecode_ocr\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"visdecode_ocr\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"visdecode_ocr\"\n\nMore Information needed" ]
d3e9b673cf1d4081b2a32f458517ca12842c5434
# Dataset of kagari/カガリ (Pokémon) This is the dataset of kagari/カガリ (Pokémon), containing 482 images and their tags. The core tags of this character are `purple_hair, short_hair, purple_eyes, horns, fake_horns, breasts, horned_headwear, bangs`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 482 | 484.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagari_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 482 | 300.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagari_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1047 | 589.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagari_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 482 | 441.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagari_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1047 | 800.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagari_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/kagari_pokemon', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, gloves, hoodie, looking_at_viewer, ribbed_sweater, solo, uniform, blush | | 1 | 11 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, gloves, holding_poke_ball, hood_up, looking_at_viewer, solo, eyelashes, poke_ball_(basic), ribbed_dress, closed_mouth, red_dress, sweater_dress, hand_up, ribbed_sweater, white_background, boots, red_footwear | | 2 | 16 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, gloves, hood_up, looking_at_viewer, eyelashes, solo, ribbed_sweater, closed_mouth, ribbed_dress, blush, upper_body, smile, sweater_dress | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, boots, gloves, pokemon_(creature), hood_up, hoodie, red_footwear, ribbed_sweater, sweater_dress, closed_mouth, full_body, looking_at_viewer, ribbed_dress, smile, standing, eyelashes, red_dress | | 4 | 20 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1boy, 1girl, blush, hetero, ribbed_sweater, solo_focus, vaginal, hoodie, pussy, gloves, penis, clothed_sex, open_mouth, large_breasts, spread_legs, uniform, bar_censor, cowgirl_position, girl_on_top, lying, nipples | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, blush, closed_mouth, large_breasts, long_sleeves, nipples, open_clothes, solo, hood_up, jacket, looking_at_viewer, ass_visible_through_thighs, cameltoe, simple_background, black_panties, black_thighhighs, breasts_out, cowboy_shot, gloves, heavy_breathing, hoodie, leotard, navel, piercing | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | gloves | hoodie | looking_at_viewer | ribbed_sweater | solo | uniform | blush | holding_poke_ball | hood_up | eyelashes | poke_ball_(basic) | ribbed_dress | closed_mouth | red_dress | sweater_dress | hand_up | white_background | boots | red_footwear | upper_body | smile | pokemon_(creature) | full_body | standing | 1boy | hetero | solo_focus | vaginal | pussy | penis | clothed_sex | open_mouth | large_breasts | spread_legs | bar_censor | cowgirl_position | girl_on_top | lying | nipples | long_sleeves | open_clothes | jacket | ass_visible_through_thighs | cameltoe | simple_background | black_panties | black_thighhighs | breasts_out | cowboy_shot | 
heavy_breathing | leotard | navel | piercing | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:---------|:--------------------|:-----------------|:-------|:----------|:--------|:--------------------|:----------|:------------|:--------------------|:---------------|:---------------|:------------|:----------------|:----------|:-------------------|:--------|:---------------|:-------------|:--------|:---------------------|:------------|:-----------|:-------|:---------|:-------------|:----------|:--------|:--------|:--------------|:-------------|:----------------|:--------------|:-------------|:-------------------|:--------------|:--------|:----------|:---------------|:---------------|:---------|:-----------------------------|:-----------|:--------------------|:----------------|:-------------------|:--------------|:--------------|:------------------|:----------|:--------|:-----------| | 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 11 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 16 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | X | X | X | | X | | X | X | | X | X | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | | | | | X | X | | X | X | X | X | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 20 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | | X | | X | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | X | | X | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
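Once the raw archive has been extracted as in the card's snippet, the per-image tags can be aggregated to get an overview of the most frequent tags, which is handy when inspecting the clusters above. A minimal sketch, assuming `item.meta['tags']` is either a mapping of tag name to score or a plain list of tag strings:

```python
from collections import Counter

from waifuc.source import LocalSource

# directory produced by the extraction step in the card's snippet
source = LocalSource('dataset_dir')

# tally tag frequencies across the whole raw dataset
counter = Counter()
for item in source:
    tags = item.meta.get('tags', {})
    counter.update(tags.keys() if isinstance(tags, dict) else tags)

# print the 20 most common tags with their counts
for tag, count in counter.most_common(20):
    print(f'{count:5d}  {tag}')
```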
CyberHarem/kagari_pokemon
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-11T18:39:40+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T20:02:06+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of kagari/カガリ (Pokémon) =============================== This is the dataset of kagari/カガリ (Pokémon), containing 482 images and their tags. The core tags of this character are 'purple\_hair, short\_hair, purple\_eyes, horns, fake\_horns, breasts, horned\_headwear, bangs', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
6eb1d813ee6076c69c39dd93aaccfb740bb22044
This is the same dataset as [`OxAISH-AL-LLM/wiki_toxic`](https://huggingface.co/datasets/OxAISH-AL-LLM/wiki_toxic). The only differences are: 1. the addition of a unique identifier, `uid`; 2. the addition of the indices, that is, 3 columns with the embeddings of 3 different sentence-transformers (`all-mpnet-base-v2`, `multi-qa-mpnet-base-dot-v1`, `all-MiniLM-L12-v2`); 3. the renaming of the `label` column to `labels` for easier compatibility with the transformers library.
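A minimal sketch of how the pieces fit together with the `datasets` library: the text, `labels`, and `uid` live in the `default` config, while each embedding index is a separate config that can be joined back on `uid`. The config names are taken from the repository metadata below; materializing a full embedding config in memory is a simplification that works at this dataset's size.

```python
from datasets import load_dataset

# text, labels and uid live in the "default" config
texts = load_dataset("pietrolesci/wikitoxic", "default", split="train")

# each embedding index is its own config, keyed by the same uid
embs = load_dataset(
    "pietrolesci/wikitoxic", "embedding_all-MiniLM-L12-v2", split="train"
)

# join an embedding back onto a text row via uid
# (column access materializes the lists in memory)
uid_to_emb = dict(zip(embs["uid"], embs["embedding_all-MiniLM-L12-v2"]))
row = texts[0]
vector = uid_to_emb[row["uid"]]
print(row["uid"], row["labels"], len(vector))
```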
pietrolesci/wikitoxic
[ "task_categories:text-classification", "task_ids:hate-speech-detection", "annotations_creators:crowdsourced", "language_creators:found", "multilinguality:monolingual", "size_categories:100K<n<1M", "source_datasets:extended|other", "language:en", "license:cc0-1.0", "wikipedia", "toxicity", "toxic comments", "region:us" ]
2023-09-11T18:43:51+00:00
{"annotations_creators": ["crowdsourced"], "language_creators": ["found"], "language": ["en"], "license": ["cc0-1.0"], "multilinguality": ["monolingual"], "size_categories": ["100K<n<1M"], "source_datasets": ["extended|other"], "task_categories": ["text-classification"], "task_ids": ["hate-speech-detection"], "pretty_name": "Toxic Wikipedia Comments", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}, {"config_name": "embedding_all-MiniLM-L12-v2", "data_files": [{"split": "train", "path": "embedding_all-MiniLM-L12-v2/train-*"}, {"split": "validation", "path": "embedding_all-MiniLM-L12-v2/validation-*"}, {"split": "test", "path": "embedding_all-MiniLM-L12-v2/test-*"}]}, {"config_name": "embedding_all-mpnet-base-v2", "data_files": [{"split": "train", "path": "embedding_all-mpnet-base-v2/train-*"}, {"split": "validation", "path": "embedding_all-mpnet-base-v2/validation-*"}, {"split": "test", "path": "embedding_all-mpnet-base-v2/test-*"}]}, {"config_name": "embedding_multi-qa-mpnet-base-dot-v1", "data_files": [{"split": "train", "path": "embedding_multi-qa-mpnet-base-dot-v1/train-*"}, {"split": "validation", "path": "embedding_multi-qa-mpnet-base-dot-v1/validation-*"}, {"split": "test", "path": "embedding_multi-qa-mpnet-base-dot-v1/test-*"}]}], "dataset_info": [{"config_name": "default", "features": [{"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "labels", "dtype": {"class_label": {"names": {"0": "non", "1": "tox"}}}}, {"name": "uid", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 55430581, "num_examples": 127656}, {"name": "validation", "num_bytes": 13936861, "num_examples": 31915}, {"name": "test", "num_bytes": 27474227, "num_examples": 63978}], "download_size": 62548640, "dataset_size": 96841669}, {"config_name": "embedding_all-MiniLM-L12-v2", "features": [{"name": "uid", "dtype": "int64"}, {"name": "embedding_all-MiniLM-L12-v2", "sequence": "float32"}], "splits": [{"name": "train", "num_bytes": 197611488, "num_examples": 127656}, {"name": "validation", "num_bytes": 49404420, "num_examples": 31915}, {"name": "test", "num_bytes": 99037944, "num_examples": 63978}], "download_size": 484421377, "dataset_size": 346053852}, {"config_name": "embedding_all-mpnet-base-v2", "features": [{"name": "uid", "dtype": "int64"}, {"name": "embedding_all-mpnet-base-v2", "sequence": "float32"}], "splits": [{"name": "train", "num_bytes": 393691104, "num_examples": 127656}, {"name": "validation", "num_bytes": 98425860, "num_examples": 31915}, {"name": "test", "num_bytes": 197308152, "num_examples": 63978}], "download_size": 827919212, "dataset_size": 689425116}, {"config_name": "embedding_multi-qa-mpnet-base-dot-v1", "features": [{"name": "uid", "dtype": "int64"}, {"name": "embedding_multi-qa-mpnet-base-dot-v1", "sequence": "float32"}], "splits": [{"name": "train", "num_bytes": 393691104, "num_examples": 127656}, {"name": "validation", "num_bytes": 98425860, "num_examples": 31915}, {"name": "test", "num_bytes": 197308152, "num_examples": 63978}], "download_size": 827907964, "dataset_size": 689425116}], "tags": ["wikipedia", "toxicity", "toxic comments"]}
2023-09-13T11:03:54+00:00
[]
[ "en" ]
TAGS #task_categories-text-classification #task_ids-hate-speech-detection #annotations_creators-crowdsourced #language_creators-found #multilinguality-monolingual #size_categories-100K<n<1M #source_datasets-extended|other #language-English #license-cc0-1.0 #wikipedia #toxicity #toxic comments #region-us
This is the same dataset as 'OxAISH-AL-LLM/wiki_toxic'. The only differences are 1. Addition of a unique identifier, 'uid' 1. Addition of the indices, that is 3 columns with the embeddings of 3 different sentence-transformers - 'all-mpnet-base-v2' - 'multi-qa-mpnet-base-dot-v1' - 'all-MiniLM-L12-v2' 1. Renaming of the 'label' column to 'labels' for easier compatibility with the transformers library
[]
[ "TAGS\n#task_categories-text-classification #task_ids-hate-speech-detection #annotations_creators-crowdsourced #language_creators-found #multilinguality-monolingual #size_categories-100K<n<1M #source_datasets-extended|other #language-English #license-cc0-1.0 #wikipedia #toxicity #toxic comments #region-us \n" ]
[ 104 ]
[ "passage: TAGS\n#task_categories-text-classification #task_ids-hate-speech-detection #annotations_creators-crowdsourced #language_creators-found #multilinguality-monolingual #size_categories-100K<n<1M #source_datasets-extended|other #language-English #license-cc0-1.0 #wikipedia #toxicity #toxic comments #region-us \n" ]
45aa240ea6b1d7976cd400f10a15510d0befc56c
# Dataset Card for "metal-python-synthetic-explanations-gpt4" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
lum-ai/metal-python-synthetic-explanations-gpt4
[ "region:us" ]
2023-09-11T18:57:08+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "chunk_id", "dtype": "string"}, {"name": "model_name", "dtype": "string"}, {"name": "temperature", "dtype": "int64"}, {"name": "max_tokens", "dtype": "float64"}, {"name": "use_raw_code", "dtype": "bool"}, {"name": "description", "dtype": "string"}, {"name": "created_at", "dtype": "timestamp[ns]"}, {"name": "raw_text", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "code", "dtype": "string"}, {"name": "kind", "dtype": "string"}, {"name": "start_text", "dtype": "int64"}, {"name": "stop_text", "dtype": "int64"}, {"name": "start_code", "dtype": "int64"}, {"name": "stop_code", "dtype": "int64"}, {"name": "domain", "dtype": "string"}, {"name": "full_name", "dtype": "string"}, {"name": "license", "struct": [{"name": "key", "dtype": "string"}, {"name": "name", "dtype": "string"}, {"name": "node_id", "dtype": "string"}, {"name": "spdx_id", "dtype": "string"}, {"name": "url", "dtype": "string"}]}, {"name": "stargazers_count", "dtype": "int64"}, {"name": "filename", "dtype": "string"}, {"name": "chunk_type", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2896865017, "num_examples": 313681}, {"name": "validation", "num_bytes": 173850658, "num_examples": 18952}, {"name": "test", "num_bytes": 339322116, "num_examples": 36740}], "download_size": 76607138, "dataset_size": 3410037791}}
2023-09-15T16:08:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for "metal-python-synthetic-explanations-gpt4" More Information needed
[ "# Dataset Card for \"metal-python-synthetic-explanations-gpt4\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"metal-python-synthetic-explanations-gpt4\"\n\nMore Information needed" ]
[ 6, 26 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"metal-python-synthetic-explanations-gpt4\"\n\nMore Information needed" ]
55572eef0b7005f3b8fdc829e003f0bfa51e42ba
# Dataset Card for [cedr] ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** [GitHub](https://github.com/sag111/CEDR) - **Repository:** [GitHub](https://github.com/sag111/CEDR) - **Paper:** [ScienceDirect](https://www.sciencedirect.com/science/article/pii/S1877050921013247) - **Leaderboard:** - **Point of Contact:** [@sag111](mailto:[email protected]) ### Dataset Summary The Corpus for Emotions Detecting in Russian-language text sentences of different social sources (CEDR) contains 9410 comments labeled for 5 emotion categories (joy, sadness, surprise, fear, and anger). There are 2 dataset configurations: - "main" - contains "text", "labels", and "source" features; - "enriched" - includes all "main" features and "sentences". The dataset comes with predefined train/test splits. ### Supported Tasks and Leaderboards This dataset is intended for multi-label emotion classification. ### Languages The data is in Russian. ## Dataset Structure ### Data Instances Each instance is a text sentence in Russian from several sources with one or more emotion annotations (or no emotion at all). An example instance from the dataset is shown below: ``` { 'text': 'Забавно как люди в возрасте удивляются входящим звонкам на мобильник)', 'labels': [0], 'source': 'twitter', 'sentences': [ [ {'forma': 'Забавно', 'lemma': 'Забавно'}, {'forma': 'как', 'lemma': 'как'}, {'forma': 'люди', 'lemma': 'человек'}, {'forma': 'в', 'lemma': 'в'}, {'forma': 'возрасте', 'lemma': 'возраст'}, {'forma': 'удивляются', 'lemma': 'удивляться'}, {'forma': 'входящим', 'lemma': 'входить'}, {'forma': 'звонкам', 'lemma': 'звонок'}, {'forma': 'на', 'lemma': 'на'}, {'forma': 'мобильник', 'lemma': 'мобильник'}, {'forma': ')', 'lemma': ')'} ] ] } ``` Emotion label codes: {0: "joy", 1: "sadness", 2: "surprise", 3: "fear", 4: "anger"} ### Data Fields The main configuration includes: - text: the text of the sentence; - labels: the emotion annotations; - source: the tag name of the corresponding source In addition to the above, the enriched configuration includes: - sentences: the text tokenized and lemmatized with [udpipe](https://ufal.mff.cuni.cz/udpipe) - 'forma': the original word form; - 'lemma': the lemma of this word ### Data Splits The dataset includes predefined train/test splits with 7528 and 1882 examples, respectively. 
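Since the `labels` field is a (possibly empty) list of the integer codes above, a small helper suffices to turn it into emotion names or a multi-hot vector for multi-label classification. A minimal, self-contained sketch that uses only the label codes and the example instance shown in this card:

```python
# emotion label codes from the card
LABELS = {0: "joy", 1: "sadness", 2: "surprise", 3: "fear", 4: "anger"}

def to_multi_hot(label_ids):
    """Map a (possibly empty) list of label ids to a 5-dimensional multi-hot vector."""
    vec = [0] * len(LABELS)
    for i in label_ids:
        vec[i] = 1
    return vec

def decode(label_ids):
    """Human-readable emotion names; an empty list means no emotion was expressed."""
    return [LABELS[i] for i in label_ids] or ["no emotion"]

# the example instance above carries a single 'joy' label
example = {"labels": [0], "source": "twitter"}
print(to_multi_hot(example["labels"]))  # [1, 0, 0, 0, 0]
print(decode(example["labels"]))        # ['joy']
```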
## Dataset Creation ### Curation Rationale The formed dataset consists of sentences in Russian from several sources (blogs, microblogs, news), which allows creating methods for analysing various types of texts. The methodology for building the dataset, based on a crowdsourcing service, can be used to expand the number of examples and so improve the accuracy of supervised classifiers. ### Source Data #### Initial Data Collection and Normalization Data was collected from several sources: posts of the LiveJournal social network, texts of the online news agency Lenta.ru, and Twitter microblog posts. Only those sentences were selected that contained marker words from the dictionary of [the emotive vocabulary of the Russian language](http://lexrus.ru/default.aspx?p=2876). The authors manually formed a list of marker words for each emotion by choosing words from different categories of the dictionary. In total, 3069 sentences were selected from LiveJournal posts, 2851 sentences from Lenta.Ru, and 3490 sentences from Twitter. After selection, sentences were offered to annotators for labeling. #### Who are the source language producers? Russian-speaking LiveJournal and Twitter users, and authors of news articles on the site lenta.ru. ### Annotations #### Annotation process Annotating sentences with emotion labels was performed with the help of [a crowdsourcing platform](https://yandex.ru/support/toloka/index.html?lang=en). The annotators’ task was: “What emotions did the author express in the sentence?”. The annotators were allowed to put an arbitrary number of the following emotion labels: "joy", "sadness", "anger", "fear", and "surprise". If the accuracy of an annotator on the control sentences (including the trial run) became less than 70%, or if the accuracy was less than 66% over the last six control samples, the annotator was dismissed. Sentences were split into tasks and assigned to annotators so that each sentence was annotated at least three times. A label of a specific emotion was assigned to a sentence if it was put by more than half of the annotators. #### Who are the annotators? Only those among the top 30% of best-performing active users (by the platform’s internal rating) who spoke Russian and were over 18 years old were allowed into the annotation process. Moreover, before a platform user could be employed as an annotator, they underwent a training task, after which they had to mark 25 trial samples with more than 80% agreement compared to the annotation that the authors had performed themselves. ### Personal and Sensitive Information The text of the sentences may contain profanity. ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators Researchers at the AI technology lab at NRC "Kurchatov Institute". See the author [list](https://www.sciencedirect.com/science/article/pii/S1877050921013247). ### Licensing Information The GitHub repository which houses this dataset has an Apache License 2.0. ### Citation Information If you have found our results helpful in your work, feel free to cite our publication. 
This is an updated version of the dataset, the collection and preparation of which is described here: ``` @article{sboev2021data, title={Data-Driven Model for Emotion Detection in Russian Texts}, author={Sboev, Alexander and Naumov, Aleksandr and Rybka, Roman}, journal={Procedia Computer Science}, volume={190}, pages={637--642}, year={2021}, publisher={Elsevier} } ``` ### Contributions Thanks to [@naumov-al](https://github.com/naumov-al) for adding this dataset.
asoria/draft-list-column
[ "task_categories:text-classification", "task_ids:sentiment-classification", "task_ids:multi-label-classification", "annotations_creators:crowdsourced", "language_creators:found", "multilinguality:monolingual", "size_categories:1K<n<10K", "source_datasets:original", "language:ru", "license:apache-2.0", "emotion-classification", "region:us" ]
2023-09-11T19:03:01+00:00
{"annotations_creators": ["crowdsourced"], "language_creators": ["found"], "language": ["ru"], "license": ["apache-2.0"], "multilinguality": ["monolingual"], "size_categories": ["1K<n<10K"], "source_datasets": ["original"], "task_categories": ["text-classification"], "task_ids": ["sentiment-classification", "multi-label-classification"], "pretty_name": "The Corpus for Emotions Detecting in Russian-language text sentences (CEDR)", "tags": ["emotion-classification"], "dataset_info": [{"config_name": "main", "features": [{"name": "text", "dtype": "string"}, {"name": "labels", "sequence": {"class_label": {"names": {"0": "joy", "1": "sadness", "2": "surprise", "3": "fear", "4": "anger"}}}}, {"name": "source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1418355, "num_examples": 7528}, {"name": "test", "num_bytes": 350275, "num_examples": 1882}], "download_size": 693026, "dataset_size": 1768630}, {"config_name": "enriched", "features": [{"name": "text", "dtype": "string"}, {"name": "labels", "sequence": {"class_label": {"names": {"0": "joy", "1": "sadness", "2": "surprise", "3": "fear", "4": "anger"}}}}, {"name": "source", "dtype": "string"}, {"name": "sentences", "list": {"list": [{"name": "forma", "dtype": "string"}, {"name": "lemma", "dtype": "string"}]}}], "splits": [{"name": "train", "num_bytes": 4792366, "num_examples": 7528}, {"name": "test", "num_bytes": 1182343, "num_examples": 1882}], "download_size": 1822522, "dataset_size": 5974709}]}
2023-09-11T19:04:38+00:00
[]
[ "ru" ]
TAGS #task_categories-text-classification #task_ids-sentiment-classification #task_ids-multi-label-classification #annotations_creators-crowdsourced #language_creators-found #multilinguality-monolingual #size_categories-1K<n<10K #source_datasets-original #language-Russian #license-apache-2.0 #emotion-classification #region-us
# Dataset Card for [cedr] ## Table of Contents - Table of Contents - Dataset Description - Dataset Summary - Supported Tasks and Leaderboards - Languages - Dataset Structure - Data Instances - Data Fields - Data Splits - Dataset Creation - Curation Rationale - Source Data - Annotations - Personal and Sensitive Information - Considerations for Using the Data - Social Impact of Dataset - Discussion of Biases - Other Known Limitations - Additional Information - Dataset Curators - Licensing Information - Citation Information - Contributions ## Dataset Description - Homepage: GitHub - Repository: GitHub - Paper: ScienceDirect - Leaderboard: - Point of Contact: @sag111 ### Dataset Summary The Corpus for Emotions Detecting in Russian-language text sentences of different social sources (CEDR) contains 9410 comments labeled for 5 emotion categories (joy, sadness, surprise, fear, and anger). Here are 2 dataset configurations: - "main" - contains "text", "labels", and "source" features; - "enriched" - includes all "main" features and "sentences". Dataset with predefined train/test splits. ### Supported Tasks and Leaderboards This dataset is intended for multi-label emotion classification. ### Languages The data is in Russian. ## Dataset Structure ### Data Instances Each instance is a text sentence in Russian from several sources with one or more emotion annotations (or no emotion at all). An example for an instance from the dataset is shown below: Emotion label codes: {0: "joy", 1: "sadness", 2: "surprise", 3: "fear", 4: "anger"} ### Data Fields The main configuration includes: - text: the text of the sentence; - labels: the emotion annotations; - source: the tag name of the corresponding source In addition to the above, the raw data includes: - sentences: text tokenized and lemmatized with udpipe - 'forma': the original word form; - 'lemma': the lemma of this word ### Data Splits The dataset includes a set of train/test splits. with 7528, and 1882 examples respectively. ## Dataset Creation ### Curation Rationale The formed dataset of examples consists of sentences in Russian from several sources (blogs, microblogs, news), which allows creating methods to analyse various types of texts. The created methodology for building the dataset based on applying a crowdsourcing service can be used to expand the number of examples to improve the accuracy of supervised classifiers. ### Source Data #### Initial Data Collection and Normalization Data was collected from several sources: posts of the Live Journal social network, texts of the online news agency URL, and Twitter microblog posts. Only those sentences were selected that contained marker words from the dictionary of the emotive vocabulary of the Russian language. The authors manually formed a list of marker words for each emotion by choosing words from different categories of the dictionary. In total, 3069 sentences were selected from LiveJournal posts, 2851 sentences from Lenta.Ru, and 3490 sentencesfrom Twitter. After selection, sentences were offered to annotators for labeling. #### Who are the source language producers? Russian-speaking LiveJournal and Tweeter users, and authors of news articles on the site URL. ### Annotations #### Annotation process Annotating sentences with labels of their emotions was performed with the help of a crowdsourcing platform. The annotators’ task was: “What emotions did the author express in the sentence?”. 
The annotators were allowed to put an arbitrary number of the following emotion labels: "joy", "sadness", "anger", "fear", and "surprise". If the accuracy of an annotator on the control sentences (including the trial run) became less than 70%, or if the accuracy was less than 66% over the last six control samples, the annotator was dismissed. Sentences were split into tasks and assigned to annotators so that each sentence was annotated at least three times. A label of a specific emotion was assigned to a sentence if put by more than half of the annotators. #### Who are the annotators? Only those of the 30% of the best-performing active users (by the platform’s internal rating) who spoke Russian and were over 18 years old were allowed into the annotation process. Moreover, before a platform user could be employed as an annotator, they underwent a training task, after which they were to mark 25 trial samples with more than 80% agreement compared to the annotation that the authors had performed themselves. ### Personal and Sensitive Information The text of the sentences may contain profanity. ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators Researchers at AI technology lab at NRC "Kurchatov Institute". See the author list. ### Licensing Information The GitHub repository which houses this dataset has an Apache License 2.0. If you have found our results helpful in your work, feel free to cite our publication. This is an updated version of the dataset, the collection and preparation of which is described here: ### Contributions Thanks to @naumov-al for adding this dataset.
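The aggregation rule described in the annotation process above (at least three annotators per sentence, and a label kept only when chosen by more than half of them) can be sketched in a few lines of Python. The per-sentence annotation sets below are illustrative only, not taken from the corpus.

```python
from collections import Counter

def aggregate(annotations):
    """Majority-vote aggregation for one sentence.

    annotations: list of per-annotator label sets, e.g. [{"joy"}, {"joy", "surprise"}, {"joy"}].
    A label is kept only if more than half of the annotators chose it.
    """
    n = len(annotations)
    counts = Counter(label for labels in annotations for label in set(labels))
    return sorted(label for label, count in counts.items() if count > n / 2)

# Three annotators: "joy" reaches a majority, "surprise" does not.
print(aggregate([{"joy"}, {"joy", "surprise"}, {"joy"}]))  # ['joy']
# No label reaches a majority, so the sentence ends up with no emotion at all.
print(aggregate([{"fear"}, {"anger"}, set()]))             # []
```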
[ "# Dataset Card for [cedr]", "## Table of Contents\n- Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions", "## Dataset Description\n\n- Homepage: GitHub\n- Repository: GitHub\n- Paper: ScienceDirect\n- Leaderboard:\n- Point of Contact: @sag111", "### Dataset Summary\n\nThe Corpus for Emotions Detecting in Russian-language text sentences of different social sources (CEDR) contains 9410 comments labeled for 5 emotion categories (joy, sadness, surprise, fear, and anger). \n\nHere are 2 dataset configurations:\n- \"main\" - contains \"text\", \"labels\", and \"source\" features;\n- \"enriched\" - includes all \"main\" features and \"sentences\".\n\nDataset with predefined train/test splits.", "### Supported Tasks and Leaderboards\n\nThis dataset is intended for multi-label emotion classification.", "### Languages\n\nThe data is in Russian.", "## Dataset Structure", "### Data Instances\n\nEach instance is a text sentence in Russian from several sources with one or more emotion annotations (or no emotion at all).\n\nAn example for an instance from the dataset is shown below:\n\n\nEmotion label codes: {0: \"joy\", 1: \"sadness\", 2: \"surprise\", 3: \"fear\", 4: \"anger\"}", "### Data Fields\n\nThe main configuration includes:\n- text: the text of the sentence;\n- labels: the emotion annotations;\n- source: the tag name of the corresponding source\n\nIn addition to the above, the raw data includes:\n- sentences: text tokenized and lemmatized with udpipe\n - 'forma': the original word form;\n - 'lemma': the lemma of this word", "### Data Splits\n\nThe dataset includes a set of train/test splits. \nwith 7528, and 1882 examples respectively.", "## Dataset Creation", "### Curation Rationale\n\nThe formed dataset of examples consists of sentences in Russian from several sources (blogs, microblogs, news), which allows creating methods to analyse various types of texts. The created methodology for building the dataset based on applying a crowdsourcing service can be used to expand the number of examples to improve the accuracy of supervised classifiers.", "### Source Data", "#### Initial Data Collection and Normalization\n\nData was collected from several sources: posts of the Live Journal social network, texts of the online news agency URL, and Twitter microblog posts.\n\nOnly those sentences were selected that contained marker words from the dictionary of the emotive vocabulary of the Russian language. The authors manually formed a list of marker words for each emotion by choosing words from different categories of the dictionary.\n\nIn total, 3069 sentences were selected from LiveJournal posts, 2851 sentences from Lenta.Ru, and 3490 sentencesfrom Twitter. 
After selection, sentences were offered to annotators for labeling.", "#### Who are the source language producers?\n\nRussian-speaking LiveJournal and Tweeter users, and authors of news articles on the site URL.", "### Annotations", "#### Annotation process\n\nAnnotating sentences with labels of their emotions was performed with the help of a crowdsourcing platform.\n\nThe annotators’ task was: “What emotions did the author express in the sentence?”. The annotators were allowed to put an arbitrary number of the following emotion labels: \"joy\", \"sadness\", \"anger\", \"fear\", and \"surprise\".\n\nIf the accuracy of an annotator on the control sentences (including the trial run) became less than 70%, or if the accuracy was less than 66% over the last six control samples, the annotator was dismissed. \n\nSentences were split into tasks and assigned to annotators so that each sentence was annotated at least three times. A label of a specific emotion was assigned to a sentence if put by more than half of the annotators.", "#### Who are the annotators?\n\nOnly those of the 30% of the best-performing active users (by the platform’s internal rating) who spoke Russian and were over 18 years old were allowed into the annotation process. Moreover, before a platform user could be employed as an annotator, they underwent a training task, after which they were to mark 25 trial samples with more than 80% agreement compared to the annotation that the authors had performed themselves.", "### Personal and Sensitive Information\n\nThe text of the sentences may contain profanity.", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators\n\nResearchers at AI technology lab at NRC \"Kurchatov Institute\". See the author list.", "### Licensing Information\n\nThe GitHub repository which houses this dataset has an Apache License 2.0.\n\n\nIf you have found our results helpful in your work, feel free to cite our publication. This is an updated version of the dataset, the collection and preparation of which is described here:", "### Contributions\n\nThanks to @naumov-al for adding this dataset." ]
[ "TAGS\n#task_categories-text-classification #task_ids-sentiment-classification #task_ids-multi-label-classification #annotations_creators-crowdsourced #language_creators-found #multilinguality-monolingual #size_categories-1K<n<10K #source_datasets-original #language-Russian #license-apache-2.0 #emotion-classification #region-us \n", "# Dataset Card for [cedr]", "## Table of Contents\n- Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions", "## Dataset Description\n\n- Homepage: GitHub\n- Repository: GitHub\n- Paper: ScienceDirect\n- Leaderboard:\n- Point of Contact: @sag111", "### Dataset Summary\n\nThe Corpus for Emotions Detecting in Russian-language text sentences of different social sources (CEDR) contains 9410 comments labeled for 5 emotion categories (joy, sadness, surprise, fear, and anger). \n\nHere are 2 dataset configurations:\n- \"main\" - contains \"text\", \"labels\", and \"source\" features;\n- \"enriched\" - includes all \"main\" features and \"sentences\".\n\nDataset with predefined train/test splits.", "### Supported Tasks and Leaderboards\n\nThis dataset is intended for multi-label emotion classification.", "### Languages\n\nThe data is in Russian.", "## Dataset Structure", "### Data Instances\n\nEach instance is a text sentence in Russian from several sources with one or more emotion annotations (or no emotion at all).\n\nAn example for an instance from the dataset is shown below:\n\n\nEmotion label codes: {0: \"joy\", 1: \"sadness\", 2: \"surprise\", 3: \"fear\", 4: \"anger\"}", "### Data Fields\n\nThe main configuration includes:\n- text: the text of the sentence;\n- labels: the emotion annotations;\n- source: the tag name of the corresponding source\n\nIn addition to the above, the raw data includes:\n- sentences: text tokenized and lemmatized with udpipe\n - 'forma': the original word form;\n - 'lemma': the lemma of this word", "### Data Splits\n\nThe dataset includes a set of train/test splits. \nwith 7528, and 1882 examples respectively.", "## Dataset Creation", "### Curation Rationale\n\nThe formed dataset of examples consists of sentences in Russian from several sources (blogs, microblogs, news), which allows creating methods to analyse various types of texts. The created methodology for building the dataset based on applying a crowdsourcing service can be used to expand the number of examples to improve the accuracy of supervised classifiers.", "### Source Data", "#### Initial Data Collection and Normalization\n\nData was collected from several sources: posts of the Live Journal social network, texts of the online news agency URL, and Twitter microblog posts.\n\nOnly those sentences were selected that contained marker words from the dictionary of the emotive vocabulary of the Russian language. The authors manually formed a list of marker words for each emotion by choosing words from different categories of the dictionary.\n\nIn total, 3069 sentences were selected from LiveJournal posts, 2851 sentences from Lenta.Ru, and 3490 sentencesfrom Twitter. 
After selection, sentences were offered to annotators for labeling.", "#### Who are the source language producers?\n\nRussian-speaking LiveJournal and Tweeter users, and authors of news articles on the site URL.", "### Annotations", "#### Annotation process\n\nAnnotating sentences with labels of their emotions was performed with the help of a crowdsourcing platform.\n\nThe annotators’ task was: “What emotions did the author express in the sentence?”. The annotators were allowed to put an arbitrary number of the following emotion labels: \"joy\", \"sadness\", \"anger\", \"fear\", and \"surprise\".\n\nIf the accuracy of an annotator on the control sentences (including the trial run) became less than 70%, or if the accuracy was less than 66% over the last six control samples, the annotator was dismissed. \n\nSentences were split into tasks and assigned to annotators so that each sentence was annotated at least three times. A label of a specific emotion was assigned to a sentence if put by more than half of the annotators.", "#### Who are the annotators?\n\nOnly those of the 30% of the best-performing active users (by the platform’s internal rating) who spoke Russian and were over 18 years old were allowed into the annotation process. Moreover, before a platform user could be employed as an annotator, they underwent a training task, after which they were to mark 25 trial samples with more than 80% agreement compared to the annotation that the authors had performed themselves.", "### Personal and Sensitive Information\n\nThe text of the sentences may contain profanity.", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators\n\nResearchers at AI technology lab at NRC \"Kurchatov Institute\". See the author list.", "### Licensing Information\n\nThe GitHub repository which houses this dataset has an Apache License 2.0.\n\n\nIf you have found our results helpful in your work, feel free to cite our publication. This is an updated version of the dataset, the collection and preparation of which is described here:", "### Contributions\n\nThanks to @naumov-al for adding this dataset." ]
[ 108, 9, 125, 35, 114, 23, 10, 6, 76, 89, 29, 5, 88, 4, 143, 33, 5, 192, 102, 20, 8, 7, 8, 7, 5, 26, 64, 18 ]
[ "passage: TAGS\n#task_categories-text-classification #task_ids-sentiment-classification #task_ids-multi-label-classification #annotations_creators-crowdsourced #language_creators-found #multilinguality-monolingual #size_categories-1K<n<10K #source_datasets-original #language-Russian #license-apache-2.0 #emotion-classification #region-us \n# Dataset Card for [cedr]## Table of Contents\n- Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions## Dataset Description\n\n- Homepage: GitHub\n- Repository: GitHub\n- Paper: ScienceDirect\n- Leaderboard:\n- Point of Contact: @sag111### Dataset Summary\n\nThe Corpus for Emotions Detecting in Russian-language text sentences of different social sources (CEDR) contains 9410 comments labeled for 5 emotion categories (joy, sadness, surprise, fear, and anger). \n\nHere are 2 dataset configurations:\n- \"main\" - contains \"text\", \"labels\", and \"source\" features;\n- \"enriched\" - includes all \"main\" features and \"sentences\".\n\nDataset with predefined train/test splits.### Supported Tasks and Leaderboards\n\nThis dataset is intended for multi-label emotion classification.### Languages\n\nThe data is in Russian.## Dataset Structure### Data Instances\n\nEach instance is a text sentence in Russian from several sources with one or more emotion annotations (or no emotion at all).\n\nAn example for an instance from the dataset is shown below:\n\n\nEmotion label codes: {0: \"joy\", 1: \"sadness\", 2: \"surprise\", 3: \"fear\", 4: \"anger\"}", "passage: ### Data Fields\n\nThe main configuration includes:\n- text: the text of the sentence;\n- labels: the emotion annotations;\n- source: the tag name of the corresponding source\n\nIn addition to the above, the raw data includes:\n- sentences: text tokenized and lemmatized with udpipe\n - 'forma': the original word form;\n - 'lemma': the lemma of this word### Data Splits\n\nThe dataset includes a set of train/test splits. \nwith 7528, and 1882 examples respectively.## Dataset Creation### Curation Rationale\n\nThe formed dataset of examples consists of sentences in Russian from several sources (blogs, microblogs, news), which allows creating methods to analyse various types of texts. The created methodology for building the dataset based on applying a crowdsourcing service can be used to expand the number of examples to improve the accuracy of supervised classifiers.### Source Data#### Initial Data Collection and Normalization\n\nData was collected from several sources: posts of the Live Journal social network, texts of the online news agency URL, and Twitter microblog posts.\n\nOnly those sentences were selected that contained marker words from the dictionary of the emotive vocabulary of the Russian language. The authors manually formed a list of marker words for each emotion by choosing words from different categories of the dictionary.\n\nIn total, 3069 sentences were selected from LiveJournal posts, 2851 sentences from Lenta.Ru, and 3490 sentencesfrom Twitter. 
After selection, sentences were offered to annotators for labeling.#### Who are the source language producers?\n\nRussian-speaking LiveJournal and Tweeter users, and authors of news articles on the site URL.### Annotations#### Annotation process\n\nAnnotating sentences with labels of their emotions was performed with the help of a crowdsourcing platform.\n\nThe annotators’ task was: “What emotions did the author express in the sentence?”. The annotators were allowed to put an arbitrary number of the following emotion labels: \"joy\", \"sadness\", \"anger\", \"fear\", and \"surprise\".\n\nIf the accuracy of an annotator on the control sentences (including the trial run) became less than 70%, or if the accuracy was less than 66% over the last six control samples, the annotator was dismissed. \n\nSentences were split into tasks and assigned to annotators so that each sentence was annotated at least three times. A label of a specific emotion was assigned to a sentence if put by more than half of the annotators." ]
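For the multi-label emotion classification task this card describes, the integer label lists can be turned into 5-dimensional multi-hot vectors before training. A minimal sketch follows, assuming the corpus is published on the Hugging Face Hub under the id `cedr`; adjust the `load_dataset` call if the repository lives under a different id.

```python
from datasets import load_dataset

EMOTIONS = ["joy", "sadness", "surprise", "fear", "anger"]  # label codes 0..4

# "main" configuration: text, labels, source; predefined train (7528) / test (1882) splits
ds = load_dataset("cedr", "main")

def to_multi_hot(example):
    vec = [0] * len(EMOTIONS)
    for label_id in example["labels"]:  # may be empty: a sentence with no emotion
        vec[label_id] = 1
    example["multi_hot"] = vec
    return example

ds = ds.map(to_multi_hot)
print(ds["train"][0]["text"], ds["train"][0]["multi_hot"])
```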
5a1b96bb4cff7976d4b4b8eeaefbcfe6119aeea1
# Dataset of sunazuka_akira/砂塚あきら (THE iDOLM@STER: Cinderella Girls) This is the dataset of sunazuka_akira/砂塚あきら (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are `mole, mole_under_eye, bangs, long_hair, brown_eyes, hair_between_eyes, twintails, brown_hair, black_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 668.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sunazuka_akira_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 361.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sunazuka_akira_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1230 | 802.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sunazuka_akira_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 582.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sunazuka_akira_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1230 | 1.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/sunazuka_akira_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/sunazuka_akira_idolmastercinderellagirls', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 46 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, green_necktie, mask_pull, sharp_teeth, solo, surgical_mask, striped_necktie, white_shirt, black_jacket, collared_shirt, upper_body, looking_at_viewer, open_mouth, long_sleeves, hooded_jacket, simple_background, blush, hood_down, white_background, hand_up, diagonal_stripes, drawstring, sleeves_past_wrists | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, backpack, black_jacket, collared_shirt, cowboy_shot, long_sleeves, looking_at_viewer, mask_pull, open_mouth, pleated_skirt, sharp_teeth, solo, striped_necktie, surgical_mask, white_shirt, green_necktie, hooded_jacket, simple_background, hands_in_pockets, hood_down, white_background, hand_in_pocket | | 2 | 44 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, solo, two_side_up, bare_shoulders, looking_at_viewer, off_shoulder, blush, sleeveless_turtleneck, beret, ribbed_sweater, sharp_teeth, black_headwear, earrings, crop_top, long_sleeves, open_jacket, black_sweater, white_jacket, belt, medium_breasts, wavy_hair, simple_background, midriff, white_background, black_skirt, grin, open_mouth, upper_body | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, blush, looking_at_viewer, medium_breasts, open_mouth, ribbed_sweater, sharp_teeth, simple_background, sleeveless_sweater, sleeveless_turtleneck, solo, upper_body, black_sweater, turtleneck_sweater, :o, bare_arms, crop_top, cropped_sweater, midriff, grey_background, hand_up, mask_pull, navel, surgical_mask, two_side_up, white_background | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_shoulders, blush, earrings, long_sleeves, looking_at_viewer, nail_polish, off_shoulder, solo, two_side_up, white_jacket, cleavage, fur_trim, holding_food, pink_dress, sharp_teeth, chocolate_bar, collarbone, heart, medium_breasts, plaid_dress, white_choker, blue_background, box, character_name, grin, red_dress, red_nails, sleeveless_dress, zipper_pull_tab | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, 
bare_shoulders, black_leotard, blush, long_sleeves, looking_at_viewer, ribbed_leotard, sharp_teeth, solo, two_side_up, white_jacket, belt, beret, black_gloves, fingerless_gloves, nail_polish, off_shoulder, black_headwear, black_nails, black_skirt, cowboy_shot, cropped_jacket, leotard_under_clothes, open_mouth, sleeveless_turtleneck, smartphone, thigh_strap, zipper_pull_tab, breasts, grin, heart, highleg_leotard, holding_phone, hoop_earrings, open_jacket, sneakers | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, blush, hair_ribbon, obi, solo, wide_sleeves, blue_kimono, blue_ribbon, floral_print, nail_polish, print_kimono, upper_body, :d, long_sleeves, looking_at_viewer, open_mouth, sharp_teeth, black_nails, blurry, holding_cup, holding_spoon, ice_cream, yukata | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | green_necktie | mask_pull | sharp_teeth | solo | surgical_mask | striped_necktie | white_shirt | black_jacket | collared_shirt | upper_body | looking_at_viewer | open_mouth | long_sleeves | hooded_jacket | simple_background | blush | hood_down | white_background | hand_up | diagonal_stripes | drawstring | sleeves_past_wrists | backpack | cowboy_shot | pleated_skirt | hands_in_pockets | hand_in_pocket | two_side_up | bare_shoulders | off_shoulder | sleeveless_turtleneck | beret | ribbed_sweater | black_headwear | earrings | crop_top | open_jacket | black_sweater | white_jacket | belt | medium_breasts | wavy_hair | midriff | black_skirt | grin | sleeveless_sweater | turtleneck_sweater | :o | bare_arms | cropped_sweater | grey_background | navel | nail_polish | cleavage | fur_trim | holding_food | pink_dress | chocolate_bar | collarbone | heart | plaid_dress | white_choker | blue_background | box | character_name | red_dress | red_nails | sleeveless_dress | zipper_pull_tab | black_leotard | ribbed_leotard | black_gloves | fingerless_gloves | black_nails | cropped_jacket | leotard_under_clothes | smartphone | thigh_strap | breasts | highleg_leotard | holding_phone | hoop_earrings | sneakers | hair_ribbon | obi | wide_sleeves | blue_kimono | blue_ribbon | floral_print | print_kimono | :d | blurry | holding_cup | holding_spoon | ice_cream | yukata | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:------------|:--------------|:-------|:----------------|:------------------|:--------------|:---------------|:-----------------|:-------------|:--------------------|:-------------|:---------------|:----------------|:--------------------|:--------|:------------|:-------------------|:----------|:-------------------|:-------------|:----------------------|:-----------|:--------------|:----------------|:-------------------|:-----------------|:--------------|:-----------------|:---------------|:------------------------|:--------|:-----------------|:-----------------|:-----------|:-----------|:--------------|:----------------|:---------------|:-------|:-----------------|:------------|:----------|:--------------|:-------|:---------------------|:---------------------|:-----|:------------|:------------------|:------------------|:--------|:--------------|:-----------|:-----------|:---------------|:-------------|:----------------|:-------------|:--------|:--------------|:---------------|:------------------|:------|:-----------------|:------------|:------------|:-------------------|:------------------|:----------------|:-----------------|:---------------|:--------------------|:--------------|:-----------------|:------------------------|:-------------|:--------------|:----------|:------------------|:----------------|:----------------|:-----------|:--------------|:------|:---------------|:--------------|:--------------|:---------------|:---------------|:-----|:---------|:--------------|:----------------|:------------|:---------| | 0 | 46 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | | X | X | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 44 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | X | X | | | | | | X | X | X | X | | X | X | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | X | X | | | | | X | X | X | | | X | X | | X | X | | | | | | | | | X | X | | X | | X | | | X | | X | | | X | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | 
![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | X | | | | | | | X | | X | | | X | | | | | | | | | | | | X | X | X | | | | | X | | | | X | | X | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | X | X | | | | | | | X | X | X | | | X | | | | | | | | X | | | | X | X | X | X | X | | X | | | X | | X | X | | | | X | X | | | | | | | | X | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | X | X | | | | | | X | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/sunazuka_akira_idolmastercinderellagirls
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-11T19:29:57+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T15:32:21+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of sunazuka\_akira/砂塚あきら (THE iDOLM@STER: Cinderella Girls) =================================================================== This is the dataset of sunazuka\_akira/砂塚あきら (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are 'mole, mole\_under\_eye, bangs, long\_hair, brown\_eyes, hair\_between\_eyes, twintails, brown\_hair, black\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
8805c59eff98b1f09e6791cd1328a9583bc4400b
# Dataset of shuten_douji/酒呑童子/酒吞童子 (Fate/Grand Order) This is the dataset of shuten_douji/酒呑童子/酒吞童子 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are `horns, oni_horns, skin-covered_horns, bangs, purple_hair, short_hair, purple_eyes, breasts, small_breasts, bob_cut, eyeliner, short_eyebrows`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 771.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shuten_douji_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 402.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shuten_douji_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1247 | 871.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shuten_douji_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 661.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shuten_douji_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1247 | 1.24 GiB | [Download](https://huggingface.co/datasets/CyberHarem/shuten_douji_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/shuten_douji_fgo', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, collarbone, long_sleeves, looking_at_viewer, makeup, off_shoulder, oni, purple_kimono, revealing_clothes, sakazuki, smile, solo, wide_sleeves, headpiece, open_mouth, obi, open_kimono, sake | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, headpiece, makeup, oni, open_mouth, purple_kimono, revealing_clothes, smile, solo, white_background, collarbone, looking_at_viewer, off_shoulder, simple_background, fang, obi | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, ankle_ribbon, barefoot_sandals, gourd, looking_at_viewer, oni, open_kimono, purple_kimono, revealing_clothes, sakazuki, smile, solo, bare_shoulders, grapes, long_sleeves, makeup, open_mouth, peach, collarbone, headpiece, holding, off_shoulder, red_ribbon, sitting, wide_sleeves, feet, navel, obi, sake, soles, toes | | 3 | 15 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, looking_at_viewer, makeup, obi, oni, smile, solo, wide_sleeves, headpiece, long_sleeves, black_kimono, floral_print, horn_ornament, blush, cherry_blossoms | | 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, kimono, looking_at_viewer, oni, sakazuki, smile, solo, bare_shoulders, open_mouth, sake, petals, fangs, navel, cherry_blossoms, revealing_clothes, sitting | | 5 | 10 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, bare_shoulders, detached_sleeves, dudou, headpiece, looking_at_viewer, makeup, oni, smile, solo, thighhighs, blush, forehead_jewel, star_print, heart, rope, thighs, earrings, fundoshi, navel, pointy_ears, closed_mouth, low_twintails, animal, ass, short_twintails, collarbone, sash, toeless_legwear | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, looking_at_viewer, makeup, oni, pointy_ears, solo, open_mouth, red_bow, bracelet, collarbone, glasses, mini_hat, pink_kimono, :d, fangs, red_choker | | 7 | 9 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | 
![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, long_sleeves, looking_at_viewer, makeup, oni, solo, black_jacket, blush, headpiece, open_jacket, smile, choker, hooded_sweater, open_mouth, bottle, pink_sweater, drawstring | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, enmaided, looking_at_viewer, maid_headdress, oni, puffy_short_sleeves, solo, waist_apron, white_apron, black_dress, frills, makeup, smile, black_thighhighs, blush, headpiece, maid_apron, thighs, white_background, detached_collar, garter_straps, jewelry, open_mouth, pointy_ears, simple_background | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, long_sleeves, looking_at_viewer, makeup, oni, solo, blush, contemporary, simple_background, thighs, pointy_ears, black_skirt, black_thighhighs, collared_shirt, dress_shirt, grin, pink_sweater, pleated_skirt, pocky, school_uniform, white_background, white_shirt | | 10 | 20 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, jiangshi, long_sleeves, looking_at_viewer, makeup, ofuda, oni, qing_guanmao, smile, solo, wide_sleeves, black_dress, hair_rings, cleavage_cutout, open_mouth, pelvic_curtain, pointy_ears, china_dress, twin_braids, blush, thighs, sash, fang, red_nails, simple_background, sharp_fingernails, white_background, zombie_pose, talisman | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | collarbone | long_sleeves | looking_at_viewer | makeup | off_shoulder | oni | purple_kimono | revealing_clothes | sakazuki | smile | solo | wide_sleeves | headpiece | open_mouth | obi | open_kimono | sake | white_background | simple_background | fang | ankle_ribbon | barefoot_sandals | gourd | grapes | peach | holding | red_ribbon | sitting | feet | navel | soles | toes | black_kimono | floral_print | horn_ornament | blush | cherry_blossoms | kimono | petals | fangs | detached_sleeves | dudou | thighhighs | forehead_jewel | star_print | heart | rope | thighs | earrings | fundoshi | pointy_ears | closed_mouth | low_twintails | animal | ass | short_twintails | sash | toeless_legwear | red_bow | bracelet | glasses | mini_hat | pink_kimono | :d | red_choker | black_jacket | open_jacket | choker | hooded_sweater | bottle | pink_sweater | drawstring | enmaided | maid_headdress | puffy_short_sleeves | waist_apron | white_apron | black_dress | frills | black_thighhighs | maid_apron | detached_collar | garter_straps | jewelry | contemporary | black_skirt | collared_shirt | dress_shirt | grin | pleated_skirt | pocky | school_uniform | white_shirt | jiangshi | ofuda | qing_guanmao | hair_rings | cleavage_cutout | pelvic_curtain | china_dress | twin_braids | red_nails | sharp_fingernails | zombie_pose | talisman | 
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------------|:-------------|:---------------|:--------------------|:---------|:---------------|:------|:----------------|:--------------------|:-----------|:--------|:-------|:---------------|:------------|:-------------|:------|:--------------|:-------|:-------------------|:--------------------|:-------|:---------------|:-------------------|:--------|:---------|:--------|:----------|:-------------|:----------|:-------|:--------|:--------|:-------|:---------------|:---------------|:----------------|:--------|:------------------|:---------|:---------|:--------|:-------------------|:--------|:-------------|:-----------------|:-------------|:--------|:-------|:---------|:-----------|:-----------|:--------------|:---------------|:----------------|:---------|:------|:------------------|:-------|:------------------|:----------|:-----------|:----------|:-----------|:--------------|:-----|:-------------|:---------------|:--------------|:---------|:-----------------|:---------|:---------------|:-------------|:-----------|:-----------------|:----------------------|:--------------|:--------------|:--------------|:---------|:-------------------|:-------------|:------------------|:----------------|:----------|:---------------|:--------------|:-----------------|:--------------|:-------|:----------------|:--------|:-----------------|:--------------|:-----------|:--------|:---------------|:-------------|:------------------|:-----------------|:--------------|:--------------|:------------|:--------------------|:--------------|:-----------| | 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | X | X | X | X | X | X | | X | X | | X | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 15 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | X | X | X | | X | | | | X | X | X | X | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 8 | ![](samples/4/clu4-sample0.png) | 
![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | | X | | | X | | X | X | X | X | | | X | | | X | | | | | | | | | | | X | | X | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 10 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | | X | X | | X | | | | X | X | | X | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | | X | X | | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 9 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | | X | X | X | | X | | | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | | | X | X | | X | | | | X | X | | X | X | | | | X | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | X | X | X | | X | | | | | X | | | | | | | X | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | 10 | 20 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | | X | X | X | | X | | | | X | X | X | | X | | | | X | X | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/shuten_douji_fgo
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-11T19:42:17+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T14:42:58+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of shuten\_douji/酒呑童子/酒吞童子 (Fate/Grand Order) ===================================================== This is the dataset of shuten\_douji/酒呑童子/酒吞童子 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are 'horns, oni\_horns, skin-covered\_horns, bangs, purple\_hair, short\_hair, purple\_eyes, breasts, small\_breasts, bob\_cut, eyeliner, short\_eyebrows', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
952e189f37dd2e59ef7689827bc481e24197ecb2
GoodWiki Dataset in QA format: questions are asked using each page's description, and the question appears again at the end of the page so the network can learn how to create questions from content
ccore/wikipedia-QA
[ "task_categories:text-generation", "size_categories:10K<n<100K", "wikipeda", "markdown", "qa", "region:us" ]
2023-09-11T19:51:52+00:00
{"size_categories": ["10K<n<100K"], "task_categories": ["text-generation"], "tags": ["wikipeda", "markdown", "qa"]}
2023-09-11T20:46:03+00:00
[]
[]
TAGS #task_categories-text-generation #size_categories-10K<n<100K #wikipeda #markdown #qa #region-us
GoodWiki Dataset in QA format: questions are asked using each page's description, and the question appears again at the end of the page so the network can learn how to create questions from content
[]
[ "TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #wikipeda #markdown #qa #region-us \n" ]
[ 37 ]
[ "passage: TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #wikipeda #markdown #qa #region-us \n" ]
201f5744bd24b47e758470e893a1cf901d1a2633
# Dataset of wicke/ビッケ (Pokémon) This is the dataset of wicke/ビッケ (Pokémon), containing 500 images and their tags. The core tags of this character are `breasts, glasses, purple_hair, green_eyes, pink-framed_eyewear, short_hair, big_hair, large_breasts, huge_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 446.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wicke_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 270.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wicke_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1127 | 537.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wicke_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 402.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wicke_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1127 | 734.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wicke_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/wicke_pokemon', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, navel, nipples, solo, nude, pussy, looking_at_viewer, smile, simple_background, white_background | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, looking_at_viewer, navel, solo, blush, smile, artist_name, cleavage, black_bikini, collarbone, mature_female, open_mouth | | 2 | 14 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, pink_sweater, ribbed_sweater, simple_background, smile, solo, turtleneck_sweater, white_background, closed_mouth, long_sleeves, looking_at_viewer, upper_body, white_coat, blush, capelet | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, long_sleeves, looking_at_viewer, pink_sweater, ribbed_sweater, simple_background, smile, solo, turtleneck_sweater, white_background, capelet, open_mouth | | 4 | 16 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | full_body, long_sleeves, pink_sweater, ribbed_sweater, turtleneck_sweater, capelet, looking_at_viewer, standing, thigh_boots, 1girl, smile, solo, white_skirt, white_footwear, closed_mouth, high_heel_boots, simple_background, hand_on_hip, white_background, white_coat | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1boy, 1girl, hetero, penis, ribbed_sweater, solo_focus, turtleneck_sweater, blush, ejaculation, mosaic_censoring, pink_sweater, sweater_lift, cum_on_breasts, nipples, paizuri, heart, open_mouth, simple_background, smile, tongue_out | | 6 | 9 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1boy, 1girl, hetero, penis, uncensored, sex_from_behind, open_mouth, ribbed_sweater, solo_focus, thighhighs, anal, artist_name, ass, blush, testicles, cum_in_pussy, high_heel_boots, pink_sweater, pokephilia, turtleneck_sweater, vaginal | | 7 | 7 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, outdoors, blue_sky, day, ocean, solo, beach, cloud, navel, smile, collarbone, looking_at_viewer, one_eye_closed, pink_bikini, thighs | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | navel | 
nipples | solo | nude | pussy | looking_at_viewer | smile | simple_background | white_background | blush | artist_name | cleavage | black_bikini | collarbone | mature_female | open_mouth | pink_sweater | ribbed_sweater | turtleneck_sweater | closed_mouth | long_sleeves | upper_body | white_coat | capelet | full_body | standing | thigh_boots | white_skirt | white_footwear | high_heel_boots | hand_on_hip | 1boy | hetero | penis | solo_focus | ejaculation | mosaic_censoring | sweater_lift | cum_on_breasts | paizuri | heart | tongue_out | uncensored | sex_from_behind | thighhighs | anal | ass | testicles | cum_in_pussy | pokephilia | vaginal | outdoors | blue_sky | day | ocean | beach | cloud | one_eye_closed | pink_bikini | thighs | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:----------|:-------|:-------|:--------|:--------------------|:--------|:--------------------|:-------------------|:--------|:--------------|:-----------|:---------------|:-------------|:----------------|:-------------|:---------------|:-----------------|:---------------------|:---------------|:---------------|:-------------|:-------------|:----------|:------------|:-----------|:--------------|:--------------|:-----------------|:------------------|:--------------|:-------|:---------|:--------|:-------------|:--------------|:-------------------|:---------------|:-----------------|:----------|:--------|:-------------|:-------------|:------------------|:-------------|:-------|:------|:------------|:---------------|:-------------|:----------|:-----------|:-----------|:------|:--------|:--------|:--------|:-----------------|:--------------|:---------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | | | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 14 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | X | | | X | X | X | X | X | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | X | | | X | X | X | X | | | | | | | X | X | X | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 16 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | | | X | X | X | X | | | | | | | | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | 
![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | | | | | X | X | | X | | | | | | X | X | X | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | 6 | 9 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | | | | | | | | X | X | | | | | X | X | X | X | | | | | | | | | | | X | | X | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | 7 | 7 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | X | | | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
CyberHarem/wicke_pokemon
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-11T19:59:43+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T20:14:31+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of wicke/ビッケ (Pokémon) ============================== This is the dataset of wicke/ビッケ (Pokémon), containing 500 images and their tags. The core tags of this character are 'breasts, glasses, purple\_hair, green\_eyes, pink-framed\_eyewear, short\_hair, big\_hair, large\_breasts, huge\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
194a832818a6b77c0bcb2e6be22b8601b040a5ec
# Dataset Card for "github-issues" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
arnmig/github-issues
[ "region:us" ]
2023-09-11T20:06:14+00:00
{"dataset_info": {"features": [{"name": "url", "dtype": "string"}, {"name": "repository_url", "dtype": "string"}, {"name": "labels_url", "dtype": "string"}, {"name": "comments_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "number", "dtype": "int64"}, {"name": "title", "dtype": "string"}, {"name": "user", "dtype": "string"}, {"name": "labels", "dtype": "string"}, {"name": "state", "dtype": "string"}, {"name": "locked", "dtype": "bool"}, {"name": "assignee", "dtype": "string"}, {"name": "assignees", "dtype": "string"}, {"name": "milestone", "dtype": "string"}, {"name": "comments", "sequence": "string"}, {"name": "created_at", "dtype": "timestamp[s]"}, {"name": "updated_at", "dtype": "timestamp[s]"}, {"name": "closed_at", "dtype": "string"}, {"name": "author_association", "dtype": "string"}, {"name": "active_lock_reason", "dtype": "string"}, {"name": "draft", "dtype": "string"}, {"name": "pull_request", "dtype": "string"}, {"name": "body", "dtype": "string"}, {"name": "reactions", "dtype": "string"}, {"name": "timeline_url", "dtype": "string"}, {"name": "performed_via_github_app", "dtype": "string"}, {"name": "state_reason", "dtype": "string"}, {"name": "is_pull_request", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 32536203, "num_examples": 6214}], "download_size": 8102507, "dataset_size": 32536203}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-11T20:06:17+00:00
[]
[]
TAGS #region-us
# Dataset Card for "github-issues" More Information needed
[ "# Dataset Card for \"github-issues\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"github-issues\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"github-issues\"\n\nMore Information needed" ]
dd7bf4bf66738613975a281e5b5f5a94fa9e76af
# Dataset Card for "HaluEval" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
notrichardren/HaluEval
[ "region:us" ]
2023-09-11T20:09:34+00:00
{"dataset_info": [{"config_name": "dialogue", "features": [{"name": "knowledge", "dtype": "string"}, {"name": "dialogue_history", "dtype": "string"}, {"name": "right_response", "dtype": "string"}, {"name": "hallucinated_response", "dtype": "string"}, {"name": "task_type", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6332598, "num_examples": 10000}], "download_size": 3451421, "dataset_size": 6332598}, {"config_name": "general", "features": [{"name": "user_query", "dtype": "string"}, {"name": "chatgpt_response", "dtype": "string"}, {"name": "hallucination_label", "dtype": "string"}, {"name": "task_type", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3010941, "num_examples": 5000}], "download_size": 1849332, "dataset_size": 3010941}, {"config_name": "qa", "features": [{"name": "knowledge", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "right_answer", "dtype": "string"}, {"name": "hallucinated_answer", "dtype": "string"}, {"name": "task_type", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5546422, "num_examples": 10000}], "download_size": 3753464, "dataset_size": 5546422}, {"config_name": "summarization", "features": [{"name": "document", "dtype": "string"}, {"name": "right_summary", "dtype": "string"}, {"name": "hallucinated_summary", "dtype": "string"}, {"name": "task_type", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 46578787, "num_examples": 10000}], "download_size": 27986765, "dataset_size": 46578787}], "configs": [{"config_name": "dialogue", "data_files": [{"split": "train", "path": "dialogue/train-*"}]}, {"config_name": "general", "data_files": [{"split": "train", "path": "general/train-*"}]}, {"config_name": "qa", "data_files": [{"split": "train", "path": "qa/train-*"}]}, {"config_name": "summarization", "data_files": [{"split": "train", "path": "summarization/train-*"}]}]}
2023-09-11T20:09:44+00:00
[]
[]
TAGS #region-us
# Dataset Card for "HaluEval" More Information needed
[ "# Dataset Card for \"HaluEval\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"HaluEval\"\n\nMore Information needed" ]
[ 6, 14 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"HaluEval\"\n\nMore Information needed" ]
f2220c8483eb01c0e962ee37b5d32f4879fb0b74
# Dataset Card for "ChatDoctor-iCliniq" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
lavita/ChatDoctor-iCliniq
[ "region:us" ]
2023-09-11T20:11:18+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "answer_icliniq", "dtype": "string"}, {"name": "answer_chatgpt", "dtype": "string"}, {"name": "answer_chatdoctor", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 16962106, "num_examples": 7321}], "download_size": 9373080, "dataset_size": 16962106}}
2023-09-11T20:13:37+00:00
[]
[]
TAGS #region-us
# Dataset Card for "ChatDoctor-iCliniq" More Information needed
[ "# Dataset Card for \"ChatDoctor-iCliniq\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"ChatDoctor-iCliniq\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"ChatDoctor-iCliniq\"\n\nMore Information needed" ]
41f497a95ab727c43adb47f66179ebc703d43c04
# Just Eval Instruct ## Highlights <div class="col-md-12"> <ul> <li><b>Data sources:</b> <a href="https://huggingface.co/datasets/tatsu-lab/alpaca_eval" target="_blank">AlpacaEval</a> (covering 5 datasets), <a href="https://huggingface.co/datasets/GAIR/lima/viewer/plain_text/test" target="_blank">LIMA-test</a>, <a href="https://huggingface.co/datasets/HuggingFaceH4/mt_bench_prompts" target="_blank">MT-bench</a>, <a href="https://huggingface.co/datasets/Anthropic/hh-rlhf/tree/main/red-team-attempts" target="_blank">Anthropic red-teaming</a>, and <a href="https://github.com/Princeton-SysML/Jailbreak_LLM/blob/main/data/MaliciousInstruct.txt" target="_blank">MaliciousInstruct</a>. </li> <li><b>1K examples:</b> 1,000 instructions, including 800 for problem-solving test, and 200 specifically for safety test. </li> <li><b>Category:</b> We tag each example with (one or multiple) labels on its task types and topics. </li> </ul> </div> ## Distribution ![](http://localhost:3000/images/eval_1.png)
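For reference, a minimal loading sketch; the config and split names are taken from the repository configuration below, so treat this as an illustration rather than an official API:

```python
from datasets import load_dataset

# The 1K tagged instructions live in the default config's "test" split.
just_eval = load_dataset("re-align/just-eval-instruct", split="test")

# Model outputs are published under the "responses" config, one split per model.
gpt4_outputs = load_dataset("re-align/just-eval-instruct", "responses", split="gpt_4_0613")

print(len(just_eval), "instructions;", len(gpt4_outputs), "GPT-4 responses")
```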
re-align/just-eval-instruct
[ "region:us" ]
2023-09-11T20:42:48+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "test_all_with_tags.jsonl"}]}, {"config_name": "responses", "data_files": [{"split": "gpt_4_0613", "path": "responses/gpt-4-0613.json"}, {"split": "gpt_4_0314", "path": "responses/gpt-4-0314.json"}, {"split": "gpt_3.5_turbo_0301", "path": "responses/gpt-3.5-turbo-0301.json"}, {"split": "Mistral_7B_Instruct_v0.1", "path": "responses/Mistral-7B-Instruct-v0.1.json"}, {"split": "Llama_2_13b_chat_hf", "path": "responses/Llama-2-13b-chat-hf.json"}, {"split": "Llama_2_70B_chat_GPTQ", "path": "responses/Llama-2-70B-chat-GPTQ.json"}, {"split": "Llama_2_7b_chat_hf", "path": "responses/Llama-2-7b-chat-hf.json"}, {"split": "vicuna_13b_v1.5", "path": "responses/vicuna-13b-v1.5.json"}, {"split": "vicuna_7b_v1.5", "path": "responses/vicuna-7b-v1.5.json"}]}, {"config_name": "judgements_main", "data_files": [{"split": "Mistral_7B_Instruct_v0.1", "path": "judgements/main/Mistral-7B-Instruct-v0.1.json"}, {"split": "gpt_4_0613", "path": "judgements/main/gpt-4-0613.json"}, {"split": "gpt_4_0314", "path": "judgements/main/gpt-4-0314.json"}, {"split": "Llama_2_70B_chat_GPTQ", "path": "judgements/main/Llama-2-70B-chat-GPTQ.json"}, {"split": "Llama_2_13b_chat_hf", "path": "judgements/main/Llama-2-13b-chat-hf.json"}, {"split": "vicuna_7b_v1.5", "path": "judgements/main/vicuna-7b-v1.5.json"}, {"split": "vicuna_13b_v1.5", "path": "judgements/main/vicuna-13b-v1.5.json"}, {"split": "gpt_3.5_turbo_0301", "path": "judgements/main/gpt-3.5-turbo-0301.json"}, {"split": "Llama_2_7b_chat_hf", "path": "judgements/main/Llama-2-7b-chat-hf.json"}]}, {"config_name": "judgements_safety", "data_files": [{"split": "Mistral_7B_Instruct_v0.1", "path": "judgements/safety/Mistral-7B-Instruct-v0.1.json"}, {"split": "gpt_4_0613", "path": "judgements/safety/gpt-4-0613.json"}, {"split": "gpt_4_0314", "path": "judgements/safety/gpt-4-0314.json"}, {"split": "Llama_2_70B_chat_GPTQ", "path": "judgements/safety/Llama-2-70B-chat-GPTQ.json"}, {"split": "Llama_2_13b_chat_hf", "path": "judgements/safety/Llama-2-13b-chat-hf.json"}, {"split": "vicuna_7b_v1.5", "path": "judgements/safety/vicuna-7b-v1.5.json"}, {"split": "vicuna_13b_v1.5", "path": "judgements/safety/vicuna-13b-v1.5.json"}, {"split": "gpt_3.5_turbo_0301", "path": "judgements/safety/gpt-3.5-turbo-0301.json"}, {"split": "Llama_2_7b_chat_hf", "path": "judgements/safety/Llama-2-7b-chat-hf.json"}]}]}
2023-12-05T00:53:11+00:00
[]
[]
TAGS #region-us
# Just Eval Instruct ## Highlights <div class="col-md-12"> <ul> <li><b>Data sources:</b> <a href="URL target="_blank">AlpacaEval</a> (covering 5 datasets), <a href="URL target="_blank">LIMA-test</a>, <a href="URL target="_blank">MT-bench</a>, <a href="URL target="_blank">Anthropic red-teaming</a>, and <a href="URL target="_blank">MaliciousInstruct</a>. </li> <li><b>1K examples:</b> 1,000 instructions, including 800 for problem-solving test, and 200 specifically for safety test. </li> <li><b>Category:</b> We tag each example with (one or multiple) labels on its task types and topics. </li> </ul> </div> ## Distribution ![](http://localhost:3000/images/eval_1.png)
[ "# Just Eval Instruct", "## Highlights\n\n<div class=\"col-md-12\"> \n <ul>\n <li><b>Data sources:</b> \n <a href=\"URL target=\"_blank\">AlpacaEval</a> (covering 5 datasets), \n <a href=\"URL target=\"_blank\">LIMA-test</a>, \n <a href=\"URL target=\"_blank\">MT-bench</a>, \n <a href=\"URL target=\"_blank\">Anthropic red-teaming</a>, \n and <a href=\"URL target=\"_blank\">MaliciousInstruct</a>. </li>\n <li><b>1K examples:</b> 1,000 instructions, including 800 for problem-solving test, and 200 specifically for safety test. </li>\n <li><b>Category:</b> We tag each example with (one or multiple) labels on its task types and topics. </li> \n </ul> \n</div>", "## Distribution\n\n![](http://localhost:3000/images/eval_1.png)" ]
[ "TAGS\n#region-us \n", "# Just Eval Instruct", "## Highlights\n\n<div class=\"col-md-12\"> \n <ul>\n <li><b>Data sources:</b> \n <a href=\"URL target=\"_blank\">AlpacaEval</a> (covering 5 datasets), \n <a href=\"URL target=\"_blank\">LIMA-test</a>, \n <a href=\"URL target=\"_blank\">MT-bench</a>, \n <a href=\"URL target=\"_blank\">Anthropic red-teaming</a>, \n and <a href=\"URL target=\"_blank\">MaliciousInstruct</a>. </li>\n <li><b>1K examples:</b> 1,000 instructions, including 800 for problem-solving test, and 200 specifically for safety test. </li>\n <li><b>Category:</b> We tag each example with (one or multiple) labels on its task types and topics. </li> \n </ul> \n</div>", "## Distribution\n\n![](http://localhost:3000/images/eval_1.png)" ]
[ 6, 6, 217, 23 ]
[ "passage: TAGS\n#region-us \n# Just Eval Instruct## Highlights\n\n<div class=\"col-md-12\"> \n <ul>\n <li><b>Data sources:</b> \n <a href=\"URL target=\"_blank\">AlpacaEval</a> (covering 5 datasets), \n <a href=\"URL target=\"_blank\">LIMA-test</a>, \n <a href=\"URL target=\"_blank\">MT-bench</a>, \n <a href=\"URL target=\"_blank\">Anthropic red-teaming</a>, \n and <a href=\"URL target=\"_blank\">MaliciousInstruct</a>. </li>\n <li><b>1K examples:</b> 1,000 instructions, including 800 for problem-solving test, and 200 specifically for safety test. </li>\n <li><b>Category:</b> We tag each example with (one or multiple) labels on its task types and topics. </li> \n </ul> \n</div>## Distribution\n\n![](http://localhost:3000/images/eval_1.png)" ]
fef9c0f19628f16e431401342024714427ac3eee
# Dataset Card for Evaluation run of jondurbin/airocoder-34b-2.1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/jondurbin/airocoder-34b-2.1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [jondurbin/airocoder-34b-2.1](https://huggingface.co/jondurbin/airocoder-34b-2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T17:47:22.739718](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1/blob/main/results_2023-10-28T17-47-22.739718.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.31669463087248323, "em_stderr": 0.004763952451764173, "f1": 0.3681816275167802, "f1_stderr": 0.0047033328815527095, "acc": 0.3913430865625522, "acc_stderr": 0.010251830385905714 }, "harness|drop|3": { "em": 0.31669463087248323, "em_stderr": 0.004763952451764173, "f1": 0.3681816275167802, "f1_stderr": 0.0047033328815527095 }, "harness|gsm8k|5": { "acc": 0.08339651250947688, "acc_stderr": 0.007615650277106696 }, "harness|winogrande|5": { "acc": 0.6992896606156275, "acc_stderr": 0.01288801049470473 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
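The aggregated scores can be fetched the same way; this is only a minimal sketch, assuming the "results" config and the "latest" split described above:

```python
from datasets import load_dataset

# "results" aggregates all runs of this model; the "latest" split always points to the newest one.
results = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1",
    "results",
    split="latest",
)
print(results[0])
```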
open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1
[ "region:us" ]
2023-09-11T20:47:53+00:00
{"pretty_name": "Evaluation run of jondurbin/airocoder-34b-2.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airocoder-34b-2.1](https://huggingface.co/jondurbin/airocoder-34b-2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T17:47:22.739718](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1/blob/main/results_2023-10-28T17-47-22.739718.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.31669463087248323,\n \"em_stderr\": 0.004763952451764173,\n \"f1\": 0.3681816275167802,\n \"f1_stderr\": 0.0047033328815527095,\n \"acc\": 0.3913430865625522,\n \"acc_stderr\": 0.010251830385905714\n },\n \"harness|drop|3\": {\n \"em\": 0.31669463087248323,\n \"em_stderr\": 0.004763952451764173,\n \"f1\": 0.3681816275167802,\n \"f1_stderr\": 0.0047033328815527095\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08339651250947688,\n \"acc_stderr\": 0.007615650277106696\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6992896606156275,\n \"acc_stderr\": 0.01288801049470473\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airocoder-34b-2.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|arc:challenge|25_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T17_47_22.739718", "path": ["**/details_harness|drop|3_2023-10-28T17-47-22.739718.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T17-47-22.739718.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T17_47_22.739718", "path": ["**/details_harness|gsm8k|5_2023-10-28T17-47-22.739718.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T17-47-22.739718.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hellaswag|10_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T21-47-37.298626.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T21-47-37.298626.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-11T21-47-37.298626.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-11T21-47-37.298626.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-11T21-47-37.298626.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T17_47_22.739718", "path": ["**/details_harness|winogrande|5_2023-10-28T17-47-22.739718.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T17-47-22.739718.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_11T21_47_37.298626", "path": ["results_2023-09-11T21-47-37.298626.parquet"]}, {"split": "2023_10_28T17_47_22.739718", "path": ["results_2023-10-28T17-47-22.739718.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T17-47-22.739718.parquet"]}]}]}
2023-10-28T16:47:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of jondurbin/airocoder-34b-2.1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model jondurbin/airocoder-34b-2.1 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-28T17:47:22.739718 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
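The card's "To load the details from a run" step carries no snippet in this flattened text; below is a minimal sketch, assuming the standard `datasets` API and a hypothetical details-repository id — the configuration name and the `latest` split are taken from the metadata above.

```python
from datasets import load_dataset

# Hypothetical repository id -- substitute the actual details repo for this model.
repo_id = "open-llm-leaderboard/details_jondurbin__airocoder-34b-2.1"

# "harness_winogrande_5" is one of the configurations listed in the metadata above;
# the "latest" split always points at the most recent evaluation run.
details = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(details)
```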
[ "# Dataset Card for Evaluation run of jondurbin/airocoder-34b-2.1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airocoder-34b-2.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T17:47:22.739718(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of jondurbin/airocoder-34b-2.1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airocoder-34b-2.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T17:47:22.739718(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airocoder-34b-2.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airocoder-34b-2.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T17:47:22.739718(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
25cf779f593afd6364dfcc67b9f80986c96f0248
# Dataset of futaba_anzu/双葉杏/후타바안즈 (THE iDOLM@STER: Cinderella Girls) This is the dataset of futaba_anzu/双葉杏/후타바안즈 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are `long_hair, blonde_hair, twintails, brown_eyes, low_twintails, very_long_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 511.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futaba_anzu_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 332.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futaba_anzu_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1139 | 688.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futaba_anzu_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 466.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futaba_anzu_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1139 | 906.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futaba_anzu_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/futaba_anzu_idolmastercinderellagirls', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
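For the processed `800`/`1200` packages listed above (IMG+TXT), here is a minimal sketch of downloading one archive and pairing each image with its tag file — the repo id and filename come from the package table, while the same-stem `.txt` caption layout is an assumption not confirmed by the card:

```python
import os
import zipfile
from glob import glob

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package (filename taken from the package table above)
zip_file = hf_hub_download(
    repo_id='CyberHarem/futaba_anzu_idolmastercinderellagirls',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract to a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# assumed layout: each image is accompanied by a .txt caption with the same stem
for image_path in sorted(glob(os.path.join(dataset_dir, '**', '*.png'), recursive=True)):
    caption_path = os.path.splitext(image_path)[0] + '.txt'
    if os.path.exists(caption_path):
        with open(caption_path, 'r', encoding='utf-8') as f:
            tags = f.read().strip()
        print(image_path, '->', tags)
```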
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 58 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, stuffed_animal, t-shirt, stuffed_bunny, clothes_writing, open_mouth, striped, off_shoulder, barefoot | | 1 | 12 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, hat, solo, stuffed_animal, stuffed_bunny, smile, open_mouth, blue_thighhighs, skirt, v | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, hat, solo, stuffed_animal, stuffed_bunny, wrist_cuffs, necktie, open_mouth, smile, v | | 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, looking_at_viewer, simple_background, solo, stuffed_animal, stuffed_bunny, white_background, bangs, blush, short_sleeves, white_shirt, t-shirt, holding_stuffed_toy, open_mouth, upper_body, object_hug, smile | | 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, hair_bow, smile, solo, blush, looking_at_viewer, polka_dot_bow, open_mouth, dress, strawberry, hair_flower, twin_braids, candy | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, hair_bow, hair_flower, looking_at_viewer, polka_dot_bow, solo, sleeveless_dress, strawberry, twin_braids, wrist_cuffs, bangs, blue_bow, blush, full_body, open_mouth, wings, :d, layered_dress, ribbon, sandals, simple_background, striped, stuffed_animal, stuffed_bunny, bare_shoulders, candy, holding, standing, white_background, frilled_choker, pink_dress, red_rose, white_footwear | | 6 | 30 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, solo, blush, bangs, looking_at_viewer, blue_dress, long_sleeves, sleeves_past_wrists, smile, white_background, blue_bow, hair_bow, cardigan, open_clothes, simple_background, open_mouth, ahoge, plaid_dress, brown_jacket, striped_bow | | 7 | 8 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, solo, crown, demon_tail, detached_sleeves, pointy_ears, striped_thighhighs, wings, skirt, blush, garter_straps, looking_at_viewer, 
twin_braids, animal_ears, halloween, jack-o'-lantern, midriff, navel, swirl_lollipop, trident | | 8 | 10 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | enmaided, 1girl, black_dress, maid_headdress, solo, blush, looking_at_viewer, bangs, twin_braids, open_mouth, puffy_short_sleeves, white_apron, wrist_cuffs, frilled_apron, holding, ribbon, food, maid_apron | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, blush, collarbone, frilled_bikini, innertube, solo, bangs, hair_flower, looking_at_viewer, bare_arms, bare_shoulders, holding_food, pink_bikini, popsicle, smile, water, white_flower, afloat, bare_legs, blue_bikini, blue_flower, brown_footwear, day, food_in_mouth, full_body, hair_between_eyes, inflatable_toy, navel, open_mouth, red_eyes, sandals, side-tie_bikini_bottom, sidelocks, yellow_flower | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | stuffed_animal | t-shirt | stuffed_bunny | clothes_writing | open_mouth | striped | off_shoulder | barefoot | hat | smile | blue_thighhighs | skirt | v | wrist_cuffs | necktie | looking_at_viewer | simple_background | white_background | bangs | blush | short_sleeves | white_shirt | holding_stuffed_toy | upper_body | object_hug | hair_bow | polka_dot_bow | dress | strawberry | hair_flower | twin_braids | candy | sleeveless_dress | blue_bow | full_body | wings | :d | layered_dress | ribbon | sandals | bare_shoulders | holding | standing | frilled_choker | pink_dress | red_rose | white_footwear | blue_dress | long_sleeves | sleeves_past_wrists | cardigan | open_clothes | ahoge | plaid_dress | brown_jacket | striped_bow | crown | demon_tail | detached_sleeves | pointy_ears | striped_thighhighs | garter_straps | animal_ears | halloween | jack-o'-lantern | midriff | navel | swirl_lollipop | trident | enmaided | black_dress | maid_headdress | puffy_short_sleeves | white_apron | frilled_apron | food | maid_apron | collarbone | frilled_bikini | innertube | bare_arms | holding_food | pink_bikini | popsicle | water | white_flower | afloat | bare_legs | blue_bikini | blue_flower | brown_footwear | day | food_in_mouth | hair_between_eyes | inflatable_toy | red_eyes | side-tie_bikini_bottom | sidelocks | yellow_flower | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:----------|:----------------|:------------------|:-------------|:----------|:---------------|:-----------|:------|:--------|:------------------|:--------|:----|:--------------|:----------|:--------------------|:--------------------|:-------------------|:--------|:--------|:----------------|:--------------|:----------------------|:-------------|:-------------|:-----------|:----------------|:--------|:-------------|:--------------|:--------------|:--------|:-------------------|:-----------|:------------|:--------|:-----|:----------------|:---------|:----------|:-----------------|:----------|:-----------|:-----------------|:-------------|:-----------|:-----------------|:-------------|:---------------|:----------------------|:-----------|:---------------|:--------|:--------------|:---------------|:--------------|:--------|:-------------|:-------------------|:--------------|:---------------------|:----------------|:--------------|:------------|:------------------|:----------|:--------|:-----------------|:----------|:-----------|:--------------|:-----------------|:----------------------|:--------------|:----------------|:-------|:-------------|:-------------|:-----------------|:------------|:------------|:---------------|:--------------|:-----------|:--------|:---------------|:---------|:------------|:--------------|:--------------|:-----------------|:------|:----------------|:--------------------|:-----------------|:-----------|:-------------------------|:------------|:----------------| | 0 | 58 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 12 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | X | | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | | X | | X | | | | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | | X | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | | | | 
X | | | | | X | | | | | | X | | | | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | | X | | X | X | | | | | | | | X | | X | X | X | X | X | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 30 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | | | | | X | | | | | X | | | | | | X | X | X | X | X | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 8 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | | | | | | | | | | | X | | | | X | | | | X | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 10 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | | | | | X | | | | | | | | | X | | X | | | X | X | | | | | | | | | | | X | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | | | | | X | | | | | X | | | | | | X | | | X | X | | | | | | | | | | X | | | | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/futaba_anzu_idolmastercinderellagirls
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-11T21:22:21+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T09:22:13+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of futaba\_anzu/双葉杏/후타바안즈 (THE iDOLM@STER: Cinderella Girls) ==================================================================== This is the dataset of futaba\_anzu/双葉杏/후타바안즈 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are 'long\_hair, blonde\_hair, twintails, brown\_eyes, low\_twintails, very\_long\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
87feb9ef7250df48afa3d1dbe82f0cff163ec3b4
# Dataset of asuna/アスナ (Pokémon) This is the dataset of asuna/アスナ (Pokémon), containing 500 images and their tags. The core tags of this character are `red_hair, breasts, long_hair, red_eyes, ponytail, large_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 472.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuna_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 294.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuna_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1082 | 583.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuna_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 428.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuna_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1082 | 781.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuna_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/asuna_pokemon', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
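Beyond printing each item as in the snippet above, a small sketch that tallies tag frequencies across the crawled images; it relies only on `item.meta['tags']`, which the loader example already exposes (the tags may be a mapping or a plain list depending on the source, so both are handled):

```python
from collections import Counter

from waifuc.source import LocalSource

# directory previously extracted from dataset-raw.zip
source = LocalSource('dataset_dir')

tag_counter = Counter()
for item in source:
    tags = item.meta.get('tags', {})
    # tags may be a tag->score mapping or a plain list; count tag names either way
    tag_counter.update(tags.keys() if isinstance(tags, dict) else tags)

# show the 20 most frequent tags in the crawled set
for tag, count in tag_counter.most_common(20):
    print(f'{tag}: {count}')
```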
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, navel, nipples, solo, no_bra, pants_pull, pussy, shirt_lift, smile, jeans, no_panties, blush, female_pubic_hair, hair_over_one_eye, holding_poke_ball, medium_breasts, poke_ball_(basic), undressing | | 1 | 12 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, crop_top, midriff, navel, :d, looking_at_viewer, open_mouth, solo, collarbone, tied_shirt, bangs, belt, simple_background, black_shirt, holding_poke_ball, poke_ball_(basic), white_background, cleavage, green_pants, sleeveless, hand_on_hip, standing, teeth | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, holding_poke_ball, poke_ball_(basic), bangs, cleavage, crop_top, looking_at_viewer, midriff, solo, belt, navel, teeth, tied_shirt, black_shirt, collarbone, green_pants, grin, hair_tie, jeans, sleeveless | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, crop_top, holding_poke_ball, midriff, poke_ball_(basic), smile, navel, open_mouth, fire, pokemon_(creature), belt, cleavage, hair_over_one_eye, jeans, looking_at_viewer, solo | | 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, crop_top, cropped_shirt, jeans, midriff, navel, smile, solo, looking_at_viewer, white_background, black_shirt, open_mouth, red_belt, simple_background | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, nipples, solo, blush, nude, one_eye_closed, open_mouth, onsen, smile, steam, towel, water | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, nipples, shirt_lift, solo, hair_over_one_eye, navel, blush, bottomless, open_mouth, pink_hair, pussy | | 7 | 9 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, blush, hetero, nipples, sex, vaginal, 1boy, navel, penis, pussy, solo_focus, sweat, bar_censor, open_mouth, spread_legs, completely_nude, missionary, teeth | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | 
![](samples/8/clu8-sample4.png) | 1boy, 1girl, completely_nude, hetero, nipples, ass, blush, mixed_bathing, onsen, open_mouth, sex_from_behind, water, doggystyle, cum_in_pussy, looking_back, vaginal | | 9 | 9 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1boy, 1girl, hetero, nipples, penis, solo_focus, blush, facial, open_mouth, cum_in_mouth, censored, cum_on_body, ejaculation, nude, paizuri, shirt_lift, smile, sweat | | 10 | 9 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1boy, 1girl, hetero, nude, solo_focus, uncensored, blush, cum, hair_tie, nipples, open_mouth, licking_penis, saliva, tongue | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | navel | nipples | solo | no_bra | pants_pull | pussy | shirt_lift | smile | jeans | no_panties | blush | female_pubic_hair | hair_over_one_eye | holding_poke_ball | medium_breasts | poke_ball_(basic) | undressing | crop_top | midriff | :d | looking_at_viewer | open_mouth | collarbone | tied_shirt | bangs | belt | simple_background | black_shirt | white_background | cleavage | green_pants | sleeveless | hand_on_hip | standing | teeth | grin | hair_tie | fire | pokemon_(creature) | cropped_shirt | red_belt | nude | one_eye_closed | onsen | steam | towel | water | bottomless | pink_hair | hetero | sex | vaginal | 1boy | penis | solo_focus | sweat | bar_censor | spread_legs | completely_nude | missionary | ass | mixed_bathing | sex_from_behind | doggystyle | cum_in_pussy | looking_back | facial | cum_in_mouth | censored | cum_on_body | ejaculation | paizuri | uncensored | cum | licking_penis | saliva | tongue | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:----------|:-------|:---------|:-------------|:--------|:-------------|:--------|:--------|:-------------|:--------|:--------------------|:--------------------|:--------------------|:-----------------|:--------------------|:-------------|:-----------|:----------|:-----|:--------------------|:-------------|:-------------|:-------------|:--------|:-------|:--------------------|:--------------|:-------------------|:-----------|:--------------|:-------------|:--------------|:-----------|:--------|:-------|:-----------|:-------|:---------------------|:----------------|:-----------|:-------|:-----------------|:--------|:--------|:--------|:--------|:-------------|:------------|:---------|:------|:----------|:-------|:--------|:-------------|:--------|:-------------|:--------------|:------------------|:-------------|:------|:----------------|:------------------|:-------------|:---------------|:---------------|:---------|:---------------|:-----------|:--------------|:--------------|:----------|:-------------|:------|:----------------|:---------|:---------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 12 | 
![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | | | | | | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | X | | | | | | X | | | | | X | | X | | X | X | | X | | X | X | X | X | | X | | X | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | X | | | | | X | X | | | | X | X | | X | | X | X | | X | X | | | | X | | | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | X | | | | | X | X | | | | | | | | | X | X | | X | X | | | | | X | X | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | X | | | | | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | X | | | X | X | | | | X | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 9 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | X | | | | X | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | X | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | X | | X | X | | | | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | 9 | 9 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | X | | | | | X | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | X | X | X | X | | | | | | | | | | | X | X | X | X | X | X | | | | | | | 10 | 9 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | X | | | | | | | | | X | | | | | | | | | | | X | 
| | | | | | | | | | | | | | X | | | | | X | | | | | | | | X | | | X | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X |
CyberHarem/asuna_pokemon
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-11T21:22:27+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T20:19:51+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of asuna/アスナ (Pokémon) ============================== This is the dataset of asuna/アスナ (Pokémon), containing 500 images and their tags. The core tags of this character are 'red\_hair, breasts, long\_hair, red\_eyes, ponytail, large\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
9af896d207876e403f5a1a88452ba1699a7170b3
# Dataset of xuanzang_sanzang/玄奘三蔵/玄奘三藏 (Fate/Grand Order) This is the dataset of xuanzang_sanzang/玄奘三蔵/玄奘三藏 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are `long_hair, breasts, large_breasts, brown_hair, earrings, hair_between_eyes, hoop_earrings, hat, purple_eyes, bangs, black_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 655.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xuanzang_sanzang_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 373.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xuanzang_sanzang_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1226 | 786.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xuanzang_sanzang_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 578.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xuanzang_sanzang_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1226 | 1.08 GiB | [Download](https://huggingface.co/datasets/CyberHarem/xuanzang_sanzang_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/xuanzang_sanzang_fgo', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
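As a follow-up to the loader above, a hedged sketch that keeps only single-character (`solo`) images and writes them out; it assumes `item.image` is a PIL image, which the card's own example suggests but does not state explicitly:

```python
import os

from waifuc.source import LocalSource

# directory previously extracted from dataset-raw.zip
source = LocalSource('dataset_dir')

output_dir = 'xuanzang_solo'
os.makedirs(output_dir, exist_ok=True)

for index, item in enumerate(source):
    tags = item.meta.get('tags', {})
    if 'solo' in tags:  # membership test works for both dict and list tag containers
        # item.image is assumed to be a PIL image here
        item.image.save(os.path.join(output_dir, f'{index}.png'))
```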
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 16 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, cleavage, looking_at_viewer, prayer_beads, solo, bead_necklace, open_mouth, white_bikini, blush, smile, bare_shoulders, navel, white_thighhighs | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bead_necklace, bikini_top_only, cleavage, grin, prayer_beads, solo, looking_at_viewer, white_bikini, blush, upper_body | | 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, cleavage, necklace, prayer_beads, smile, solo, thighhighs, looking_at_viewer, red_eyes, staff, bikini_top_only | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1boy, 1girl, hetero, paizuri, penis, prayer_beads, solo_focus, bead_necklace, bikini, breasts_squeezed_together, cum_on_breasts, looking_at_viewer, bar_censor, cleavage, facial, nose_blush, out_of_frame, sweat, brown_eyes, heavy_breathing, open_mouth, pov | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1boy, 1girl, bead_necklace, breasts_squeezed_together, hetero, nipples, paizuri, penis, prayer_beads, solo_focus, looking_at_viewer, sweat, huge_breasts, male_pubic_hair, open_mouth, bar_censor, closed_mouth, ejaculation, nose_blush, pov_crotch, smile, white_bikini | | 5 | 7 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1boy, 1girl, bead_necklace, blush, hetero, navel, prayer_beads, spread_legs, thighs, looking_at_viewer, nipples, penis, purple_bikini, sweat, thighhighs, open_mouth, sex, vaginal, bare_shoulders, mosaic_censoring, on_back, bridal_gauntlets, clothing_aside, cum_in_pussy, detached_sleeves, missionary, smile, solo_focus | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | blush, covered_nipples, fake_animal_ears, heart_pasties, looking_at_viewer, rabbit_ears, reverse_bunnysuit, 1girl, long_sleeves, revealing_clothes, shrug_(clothing), solo, thighs, wrist_cuffs, maebari, navel, open_mouth, bead_necklace, bridal_gauntlets, grin, headpiece, prayer_beads, swimsuit | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | prayer_beads | solo | bead_necklace | 
open_mouth | white_bikini | blush | smile | bare_shoulders | navel | white_thighhighs | bikini_top_only | grin | upper_body | necklace | thighhighs | red_eyes | staff | 1boy | hetero | paizuri | penis | solo_focus | bikini | breasts_squeezed_together | cum_on_breasts | bar_censor | facial | nose_blush | out_of_frame | sweat | brown_eyes | heavy_breathing | pov | nipples | huge_breasts | male_pubic_hair | closed_mouth | ejaculation | pov_crotch | spread_legs | thighs | purple_bikini | sex | vaginal | mosaic_censoring | on_back | bridal_gauntlets | clothing_aside | cum_in_pussy | detached_sleeves | missionary | covered_nipples | fake_animal_ears | heart_pasties | rabbit_ears | reverse_bunnysuit | long_sleeves | revealing_clothes | shrug_(clothing) | wrist_cuffs | maebari | headpiece | swimsuit | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:---------------|:-------|:----------------|:-------------|:---------------|:--------|:--------|:-----------------|:--------|:-------------------|:------------------|:-------|:-------------|:-----------|:-------------|:-----------|:--------|:-------|:---------|:----------|:--------|:-------------|:---------|:----------------------------|:-----------------|:-------------|:---------|:-------------|:---------------|:--------|:-------------|:------------------|:------|:----------|:---------------|:------------------|:---------------|:--------------|:-------------|:--------------|:---------|:----------------|:------|:----------|:-------------------|:----------|:-------------------|:-----------------|:---------------|:-------------------|:-------------|:------------------|:-------------------|:----------------|:--------------|:--------------------|:---------------|:--------------------|:-------------------|:--------------|:----------|:------------|:-----------| | 0 | 16 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | | X | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | | | | | X | | | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X 
| | X | X | | X | X | X | | X | | | | | | | | | | | X | X | X | X | X | | X | | X | | X | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 7 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | X | | X | X | | X | X | X | X | | | | | | X | | | X | X | | X | X | | | | | | | | X | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | X | X | X | X | | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/xuanzang_sanzang_fgo
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-11T21:32:25+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-12T19:44:15+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of xuanzang\_sanzang/玄奘三蔵/玄奘三藏 (Fate/Grand Order) ========================================================= This is the dataset of xuanzang\_sanzang/玄奘三蔵/玄奘三藏 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are 'long\_hair, breasts, large\_breasts, brown\_hair, earrings, hair\_between\_eyes, hoop\_earrings, hat, purple\_eyes, bangs, black\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
0279488218a0c46bf7f559ca123145e491ddf688
# Dataset of nagi (Pokémon) This is the dataset of nagi (Pokémon), containing 88 images and their tags. The core tags of this character are `long_hair, purple_hair, purple_eyes, breasts, eyelashes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 88 | 81.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagi_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 88 | 51.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagi_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 168 | 94.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagi_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 88 | 74.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagi_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 168 | 124.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagi_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/nagi_pokemon', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
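Because the core tags listed above are pruned from this dataset, one may want to prepend them again when turning a cluster's tag list into a prompt or caption. A tiny sketch, where the trigger word is a placeholder and the cluster tags are a sample from the table below:

```python
# core tags are pruned from the dataset (see above), so re-add them for prompting
core_tags = ['long_hair', 'purple_hair', 'purple_eyes', 'breasts', 'eyelashes']

# a few sample tags from cluster 0 in the table below
cluster_tags = ['1girl', 'helmet', 'pokemon_(creature)', 'closed_mouth',
                'long_sleeves', 'white_gloves', 'smile']

trigger_word = 'nagi_(pokemon)'  # placeholder; use whatever trigger your model expects
caption = ', '.join([trigger_word] + core_tags + cluster_tags)
print(caption)
```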
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, helmet, pokemon_(creature), closed_mouth, long_sleeves, white_gloves, smile, blush, looking_at_viewer, sky, very_long_hair, cloud, collarbone, floating_hair | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, helmet, open_mouth, :d, long_sleeves, tongue, white_gloves, pokemon_(creature), blue_bodysuit, cloud, collarbone, day, outdoors, sky, solo | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | helmet | pokemon_(creature) | closed_mouth | long_sleeves | white_gloves | smile | blush | looking_at_viewer | sky | very_long_hair | cloud | collarbone | floating_hair | open_mouth | :d | tongue | blue_bodysuit | day | outdoors | solo | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:---------------------|:---------------|:---------------|:---------------|:--------|:--------|:--------------------|:------|:-----------------|:--------|:-------------|:----------------|:-------------|:-----|:---------|:----------------|:------|:-----------|:-------| | 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | X | X | | | | X | | X | X | | X | X | X | X | X | X | X |
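The preprocessed IMG+TXT packages listed above (e.g. `dataset-800.zip`) can also be used directly, without waifuc. The snippet below is only a minimal sketch, not part of the official tooling; it assumes the conventional IMG+TXT layout in which every image in the archive is paired with a same-named `.txt` file holding its comma-separated tags.

```python
import os
import zipfile
from glob import glob

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package of this dataset
zip_file = hf_hub_download(
    repo_id='CyberHarem/nagi_pokemon',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to a local directory
dataset_dir = 'nagi_pokemon_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# iterate over (image, tags) pairs; the <name>.txt naming convention is an assumption
for txt_path in sorted(glob(os.path.join(dataset_dir, '**', '*.txt'), recursive=True)):
    base, _ = os.path.splitext(txt_path)
    image_path = next(
        (base + ext for ext in ('.png', '.jpg', '.jpeg', '.webp') if os.path.exists(base + ext)),
        None,
    )
    with open(txt_path, 'r', encoding='utf-8') as f:
        tags = f.read().strip()
    print(image_path, tags)
```

The same pattern applies to `dataset-1200.zip` and the `stage3-p480-*` packages.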
CyberHarem/nagi_pokemon
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-11T21:35:20+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T19:09:42+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of nagi (Pokémon) ========================= This is the dataset of nagi (Pokémon), containing 88 images and their tags. The core tags of this character are 'long\_hair, purple\_hair, purple\_eyes, breasts, eyelashes', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
491bd77b8f39fa794169afd984ef040dcfe3558f
# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096](https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-169M-20230520-done-ctx4096",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2023-10-24T22:45:34.722006](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-169M-20230520-done-ctx4096/blob/main/results_2023-10-24T22-45-34.722006.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.0005243288590604027,
        "em_stderr": 0.00023443780464836852,
        "f1": 0.03530411073825506,
        "f1_stderr": 0.0010228606256930922,
        "acc": 0.24775172887546654,
        "acc_stderr": 0.007871628031487186
    },
    "harness|drop|3": {
        "em": 0.0005243288590604027,
        "em_stderr": 0.00023443780464836852,
        "f1": 0.03530411073825506,
        "f1_stderr": 0.0010228606256930922
    },
    "harness|gsm8k|5": {
        "acc": 0.0037907505686125853,
        "acc_stderr": 0.0016927007401501798
    },
    "harness|winogrande|5": {
        "acc": 0.49171270718232046,
        "acc_stderr": 0.014050555322824194
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
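In the same way as the per-task snippet shown in the summary above, the aggregated "results" configuration of this repository can be loaded directly. The sketch below relies only on the config and split names declared for this dataset (the "latest" split points at the most recent run); since the exact column layout of the results table is not documented in this card, it simply prints the first record.

```python
from datasets import load_dataset

# load the aggregated results of the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-169M-20230520-done-ctx4096",
    "results",
    split="latest",
)

# inspect the first aggregated record (field layout is not documented in this card)
print(results[0])
```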
open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-169M-20230520-done-ctx4096
[ "region:us" ]
2023-09-11T21:53:14+00:00
{"pretty_name": "Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096](https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-169M-20230520-done-ctx4096\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-24T22:45:34.722006](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-169M-20230520-done-ctx4096/blob/main/results_2023-10-24T22-45-34.722006.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.00023443780464836852,\n \"f1\": 0.03530411073825506,\n \"f1_stderr\": 0.0010228606256930922,\n \"acc\": 0.24775172887546654,\n \"acc_stderr\": 0.007871628031487186\n },\n \"harness|drop|3\": {\n \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.00023443780464836852,\n \"f1\": 0.03530411073825506,\n \"f1_stderr\": 0.0010228606256930922\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \"acc_stderr\": 0.0016927007401501798\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.49171270718232046,\n \"acc_stderr\": 0.014050555322824194\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|arc:challenge|25_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_24T22_45_34.722006", "path": ["**/details_harness|drop|3_2023-10-24T22-45-34.722006.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-24T22-45-34.722006.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_24T22_45_34.722006", "path": ["**/details_harness|gsm8k|5_2023-10-24T22-45-34.722006.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-24T22-45-34.722006.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": 
["**/details_harness|hellaswag|10_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-11T22-53-03.522910.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T22-53-03.522910.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T22-53-03.522910.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-09-11T22-53-03.522910.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T22-53-03.522910.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-11T22-53-03.522910.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-11T22-53-03.522910.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_24T22_45_34.722006", "path": ["**/details_harness|winogrande|5_2023-10-24T22-45-34.722006.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-24T22-45-34.722006.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_11T22_53_03.522910", "path": ["results_2023-09-11T22-53-03.522910.parquet"]}, {"split": "2023_10_24T22_45_34.722006", "path": ["results_2023-10-24T22-45-34.722006.parquet"]}, {"split": "latest", "path": ["results_2023-10-24T22-45-34.722006.parquet"]}]}]}
2023-10-24T21:45:47+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096 on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-24T22:45:34.722006(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-24T22:45:34.722006(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-24T22:45:34.722006(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 35, 31, 183, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/RWKV-4-PilePlus-169M-20230520-done-ctx4096 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-24T22:45:34.722006(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
cbe387d48d6b659166e37f51d324b7079f149978
# Dataset Card for "notable_take_home" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Siddharthr30/notable_take_home
[ "region:us" ]
2023-09-11T22:08:22+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "labels", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 671512, "num_examples": 2628}, {"name": "validation", "num_bytes": 222336, "num_examples": 876}, {"name": "test", "num_bytes": 226127, "num_examples": 876}], "download_size": 0, "dataset_size": 1119975}}
2023-09-12T01:16:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for "notable_take_home" More Information needed
[ "# Dataset Card for \"notable_take_home\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"notable_take_home\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"notable_take_home\"\n\nMore Information needed" ]
f6fd6c1becf56fea2331acf573aaeaeb7e411701
# Dataset of minamoto_no_raikou/源頼光/源赖光 (Fate/Grand Order) This is the dataset of minamoto_no_raikou/源頼光/源赖光 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are `purple_hair, long_hair, breasts, purple_eyes, very_long_hair, large_breasts, bangs, parted_bangs, low-tied_long_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 789.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minamoto_no_raikou_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 419.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minamoto_no_raikou_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1195 | 836.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minamoto_no_raikou_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 682.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/minamoto_no_raikou_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1195 | 1.19 GiB | [Download](https://huggingface.co/datasets/CyberHarem/minamoto_no_raikou_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/minamoto_no_raikou_fgo', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, black_gloves, fingerless_gloves, katana, looking_at_viewer, purple_bodysuit, rope, solo, tabard, holding_sword, japanese_armor, ribbed_sleeves, arm_guards, covered_navel, smile, loincloth | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_gloves, fingerless_gloves, katana, looking_at_viewer, simple_background, smile, solo, white_background, holding_sword, ribbed_sleeves, tabard, purple_bodysuit, covered_navel, rope | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, black_gloves, fingerless_gloves, looking_at_viewer, purple_bodysuit, simple_background, solo, white_background, ribbed_sleeves, rope, smile, tabard, arm_guards | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, blush, looking_at_viewer, ribbed_sweater, smile, solo, turtleneck_sweater, huge_breasts, purple_sweater, simple_background | | 4 | 14 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, armlet, collarbone, eyepatch_bikini, looking_at_viewer, purple_bikini, solo, bare_shoulders, choker, cleavage, thighs, navel, smile, beads, blue_sky, fingerless_gloves, single_glove, blush, rope, beach, black_gloves, day | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, blush, nipples, collarbone, completely_nude, sweat, thighs, 1boy, hetero, looking_at_viewer, navel, open_mouth, sex, mosaic_censoring, penis, pussy, vaginal, girl_on_top, solo, spread_legs, white_background | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, bare_shoulders, blush, looking_at_viewer, navel, purple_bra, purple_panties, solo, cleavage, thighs, underwear_only, collarbone, lingerie, parted_lips, smile, bow_panties, gradient_background, lace_trim, simple_background | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, collared_shirt, dress_shirt, looking_at_viewer, office_lady, solo, white_shirt, alternate_costume, pencil_skirt, black_skirt, simple_background, smile, white_background, formal, 
huge_breasts, jacket, long_sleeves, pantyhose, suit | | 8 | 8 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, looking_at_viewer, smile, solo, obi, purple_kimono, cleavage, floral_print, hair_flower, new_year, wide_sleeves | | 9 | 20 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, solo, looking_at_viewer, purple_dress, bare_shoulders, china_dress, earrings, smile, gloves, feather_boa, folding_fan, holding_fan, side_slit, blush, sitting, thighs, simple_background, sleeveless, white_background | | 10 | 5 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, black_dress, enmaided, maid_headdress, solo, frills, looking_at_viewer, white_background, simple_background, ass, black_footwear, blush, garter_straps, juliet_sleeves, thighs, waist_apron, white_thighhighs | | 11 | 19 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 1girl, bare_shoulders, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, cleavage, solo, detached_collar, blush, wrist_cuffs, pantyhose, smile, covered_navel, highleg_leotard, purple_leotard, thighs, bowtie, fishnets, rabbit_tail, white_background | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | fingerless_gloves | katana | looking_at_viewer | purple_bodysuit | rope | solo | tabard | holding_sword | japanese_armor | ribbed_sleeves | arm_guards | covered_navel | smile | loincloth | simple_background | white_background | blush | ribbed_sweater | turtleneck_sweater | huge_breasts | purple_sweater | armlet | collarbone | eyepatch_bikini | purple_bikini | bare_shoulders | choker | cleavage | thighs | navel | beads | blue_sky | single_glove | beach | day | nipples | completely_nude | sweat | 1boy | hetero | open_mouth | sex | mosaic_censoring | penis | pussy | vaginal | girl_on_top | spread_legs | purple_bra | purple_panties | underwear_only | lingerie | parted_lips | bow_panties | gradient_background | lace_trim | collared_shirt | dress_shirt | office_lady | white_shirt | alternate_costume | pencil_skirt | black_skirt | formal | jacket | long_sleeves | pantyhose | suit | obi | purple_kimono | floral_print | hair_flower | new_year | wide_sleeves | purple_dress | china_dress | earrings | gloves | feather_boa | folding_fan | holding_fan | side_slit | sitting | sleeveless | black_dress | enmaided | maid_headdress | frills | ass | black_footwear | garter_straps | juliet_sleeves | waist_apron | white_thighhighs | fake_animal_ears | playboy_bunny | rabbit_ears | detached_collar | wrist_cuffs | highleg_leotard | purple_leotard | bowtie | fishnets | rabbit_tail | 
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:--------------------|:---------|:--------------------|:------------------|:-------|:-------|:---------|:----------------|:-----------------|:-----------------|:-------------|:----------------|:--------|:------------|:--------------------|:-------------------|:--------|:-----------------|:---------------------|:---------------|:-----------------|:---------|:-------------|:------------------|:----------------|:-----------------|:---------|:-----------|:---------|:--------|:--------|:-----------|:---------------|:--------|:------|:----------|:------------------|:--------|:-------|:---------|:-------------|:------|:-------------------|:--------|:--------|:----------|:--------------|:--------------|:-------------|:-----------------|:-----------------|:-----------|:--------------|:--------------|:----------------------|:------------|:-----------------|:--------------|:--------------|:--------------|:--------------------|:---------------|:--------------|:---------|:---------|:---------------|:------------|:-------|:------|:----------------|:---------------|:--------------|:-----------|:---------------|:---------------|:--------------|:-----------|:---------|:--------------|:--------------|:--------------|:------------|:----------|:-------------|:--------------|:-----------|:-----------------|:---------|:------|:-----------------|:----------------|:-----------------|:--------------|:-------------------|:-------------------|:----------------|:--------------|:------------------|:--------------|:------------------|:-----------------|:---------|:-----------|:--------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | X | | X | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | | X | X | X | X | X | | | X | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | | X | | | X | | | | | | | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 14 | ![](samples/4/clu4-sample0.png) | 
![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | | X | | X | X | | | | | | | X | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | | X | | | X | | | | | | | | | | X | X | | | | | | X | | | | | | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | | X | | | X | | | | | | | X | | X | | X | | | | | | X | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | | | X | | | X | | | | | | | X | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 8 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | | | X | | | X | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 20 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | | X | | | X | | | | | | | X | | X | X | X | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | 10 | 5 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | | | X | | | X | | | | | | | | | X | X | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | 11 | 19 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | | | | X | | | X | | | | | | X | X | | | X | X | | | | | | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
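### Load an IMG+TXT Package without Waifuc

The resized `IMG+TXT` packages can also be consumed directly, without waifuc. The sketch below is a minimal example under one assumption: each image in the archive is paired with a same-named `.txt` file of comma-separated tags, which is the conventional layout for IMG+TXT training packages. The repo id and the `dataset-800.zip` filename come from the package table above; everything else (directory names, extensions) is illustrative.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package (repo id and filename from the package table above)
zip_file = hf_hub_download(
    repo_id='CyberHarem/minamoto_no_raikou_fgo',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract into a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed layout: one comma-separated .txt per image)
for root, _, files in os.walk(dataset_dir):
    for name in files:
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
            continue
        txt_path = os.path.join(root, stem + '.txt')
        if not os.path.exists(txt_path):
            continue
        with open(txt_path, 'r', encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        print(name, tags[:5])
```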
CyberHarem/minamoto_no_raikou_fgo
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-11T22:59:03+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T17:24:08+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of minamoto\_no\_raikou/源頼光/源赖光 (Fate/Grand Order) ========================================================== This is the dataset of minamoto\_no\_raikou/源頼光/源赖光 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are 'purple\_hair, long\_hair, breasts, purple\_eyes, very\_long\_hair, large\_breasts, bangs, parted\_bangs, low-tied\_long\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
431506424b6faafed9a9c34061126f374d58b7f2
# Dataset of kanna/カンナ (Pokémon) This is the dataset of kanna/カンナ (Pokémon), containing 323 images and their tags. The core tags of this character are `glasses, breasts, red_hair, long_hair, red_eyes, ponytail, large_breasts, bangs, sidelocks`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 323 | 289.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kanna_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 323 | 179.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kanna_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 692 | 345.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kanna_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 323 | 261.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kanna_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 692 | 462.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kanna_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/kanna_pokemon', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, smile, high_heels, jacket, pencil_skirt, holding_poke_ball, formal, looking_at_viewer, pantyhose, poke_ball_(basic), solo, ahoge, cleavage_cutout, pokemon_(creature), white_background, black_footwear, full_body, suit | | 1 | 24 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, ahoge, cleavage_cutout, long_sleeves, smile, solo, looking_at_viewer, pantyhose, black_jacket, sitting, black_skirt, closed_mouth | | 2 | 19 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, purple_skirt, looking_at_viewer, black_shirt, sleeveless_shirt, solo, smile, bracelet, orange_eyes, orange_hair, side_slit, bare_arms, closed_mouth, hand_up, holding_poke_ball, poke_ball_(basic), eyelashes, simple_background | | 3 | 36 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, looking_at_viewer, smile, solo, thick_thighs, cleavage, beach, huge_breasts, outdoors, sky, cloud, day, ocean, miniskirt, sleeveless_shirt, armpits, sand, shore, sweat, black_shirt, curvy, muscular_female, arms_up, blush, arms_behind_head, purple_skirt | | 4 | 29 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | nude, 1girl, nipples, solo, navel, smile, looking_at_viewer, pussy, blush | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, hetero, nipples, penis, vaginal, 1boy, cum_in_pussy, blush, completely_nude, solo_focus, spread_legs, sweat, uncensored, sex_from_behind, straddling | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | high_heels | jacket | pencil_skirt | holding_poke_ball | formal | looking_at_viewer | pantyhose | poke_ball_(basic) | solo | ahoge | cleavage_cutout | pokemon_(creature) | white_background | black_footwear | full_body | suit | long_sleeves | black_jacket | sitting | black_skirt | closed_mouth | purple_skirt | black_shirt | sleeveless_shirt | bracelet | orange_eyes | orange_hair | side_slit | bare_arms | hand_up | eyelashes | simple_background | thick_thighs | cleavage | beach | huge_breasts | outdoors | sky | cloud | day | ocean | miniskirt | armpits | sand | shore | sweat | curvy | muscular_female | arms_up | blush | arms_behind_head | nude | nipples | navel | pussy | hetero | penis | vaginal | 1boy | cum_in_pussy | completely_nude | 
solo_focus | spread_legs | uncensored | sex_from_behind | straddling | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------------|:---------|:---------------|:--------------------|:---------|:--------------------|:------------|:--------------------|:-------|:--------|:------------------|:---------------------|:-------------------|:-----------------|:------------|:-------|:---------------|:---------------|:----------|:--------------|:---------------|:---------------|:--------------|:-------------------|:-----------|:--------------|:--------------|:------------|:------------|:----------|:------------|:--------------------|:---------------|:-----------|:--------|:---------------|:-----------|:------|:--------|:------|:--------|:------------|:----------|:-------|:--------|:--------|:--------|:------------------|:----------|:--------|:-------------------|:-------|:----------|:--------|:--------|:---------|:--------|:----------|:-------|:---------------|:------------------|:-------------|:--------------|:-------------|:------------------|:-------------| | 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 24 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | | | | | X | X | | X | X | X | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 19 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | | X | | X | | X | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 36 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | | | | | X | | | X | | | | | | | | | | | | | X | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | 4 | 29 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | | | | | | | | | | | | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X |
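### Count Tags with LocalSource

A quick way to sanity-check or extend the cluster tables above is to count tag frequencies over the extracted raw package, reusing the `LocalSource` from the loading snippet. This is a minimal sketch; it relies only on `item.meta['tags']` as shown above, and hedges on whether that field is a tag-to-score mapping or a plain list of tag names.

```python
from collections import Counter

from waifuc.source import LocalSource

# 'dataset_dir' is the directory the raw archive was extracted to in the loading snippet above
source = LocalSource('dataset_dir')

counter = Counter()
for item in source:
    tags = item.meta.get('tags', {})
    # tags may be a mapping (tag -> score) or a plain sequence of tag names; handle both
    names = tags.keys() if isinstance(tags, dict) else tags
    counter.update(names)

# print the 20 most frequent tags, e.g. to compare against the cluster tables above
for tag, count in counter.most_common(20):
    print(f'{tag}: {count}')
```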
CyberHarem/kanna_pokemon
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-11T23:02:01+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T19:51:55+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of kanna/カンナ (Pokémon) ============================== This is the dataset of kanna/カンナ (Pokémon), containing 323 images and their tags. The core tags of this character are 'glasses, breasts, red\_hair, long\_hair, red\_eyes, ponytail, large\_breasts, bangs, sidelocks', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
5b3948b82335e999190233d42b768543ef8c8e47
# Transfer from D4RL datasets to Minari datasets

Transfer scripts and validation are in transfer.py.

1. Clone the repo

```
$ git clone https://huggingface.co/datasets/im-Kitsch/minari_d4rl
```

2. Copy the files to the Minari root (default is `~/.minari`)

```
mv minari_d4rl/datasets ~/.minari/datasets
```

# TODO

Fields such as `infos/qvel` are not saved yet, since the interface is not stable and those fields cannot be read directly.
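Once the files are under `~/.minari/datasets`, they should be visible to the regular Minari API. A minimal check is sketched below; `minari.list_local_datasets()` and `minari.load_dataset()` are standard Minari calls, but the dataset id is a placeholder, and since the interface noted in the TODO is still unstable, attribute names may differ between Minari versions.

```python
import minari

# datasets found in the local Minari root (default ~/.minari)
print(minari.list_local_datasets())

# placeholder id: substitute one of the names printed above
dataset = minari.load_dataset('door-human-v1')
print(dataset.total_episodes)

# inspect the first episode
for episode in dataset.iterate_episodes():
    print(len(episode.rewards), float(episode.rewards.sum()))
    break
```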
im-Kitsch/minari_d4rl
[ "task_categories:reinforcement-learning", "license:apache-2.0", "region:us" ]
2023-09-11T23:12:59+00:00
{"license": "apache-2.0", "task_categories": ["reinforcement-learning"]}
2023-09-13T12:21:04+00:00
[]
[]
TAGS #task_categories-reinforcement-learning #license-apache-2.0 #region-us
# transfer from d4rl dataset to minari dataset transfer scripts and validation are in URL 1. clone the repo 2. copy the file to minari root (default is ~/.minari) # todo infos like 'infos/qvel' are not saved since the interface is not stable yet and those infos cannot be read directly.
[ "# transfer from d4rl dataset to minari dataset\n\ntransfer scripts and validation are in URL\n\n1. clone the repo\n \n\n2. copy the file to minari root (default is ~/.minari)", "# todo \ninfos like 'infos/qvel' are not saved since the interface is not stable yet and those infos cannot be read directly." ]
[ "TAGS\n#task_categories-reinforcement-learning #license-apache-2.0 #region-us \n", "# transfer from d4rl dataset to minari dataset\n\ntransfer scripts and validation are in URL\n\n1. clone the repo\n \n\n2. copy the file to minari root (default is ~/.minari)", "# todo \ninfos like 'infos/qvel' are not saved since the interface is not stable yet and those infos cannot be read directly." ]
[ 26, 44, 32 ]
[ "passage: TAGS\n#task_categories-reinforcement-learning #license-apache-2.0 #region-us \n# transfer from d4rl dataset to minari dataset\n\ntransfer scripts and validation are in URL\n\n1. clone the repo\n \n\n2. copy the file to minari root (default is ~/.minari)# todo \ninfos like 'infos/qvel' are not saved since the interface is not stable yet and those infos cannot be read directly." ]
094e4ae1570b0e4bbfbe6cd332750464f259e8ae
# Dataset of hayami_kanade/速水奏 (THE iDOLM@STER: Cinderella Girls) This is the dataset of hayami_kanade/速水奏 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are `short_hair, yellow_eyes, bangs, parted_bangs, breasts, black_hair, blue_hair, medium_breasts, earrings`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 812.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hayami_kanade_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 433.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hayami_kanade_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1291 | 956.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hayami_kanade_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 710.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hayami_kanade_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1291 | 1.37 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hayami_kanade_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/hayami_kanade_idolmastercinderellagirls', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, elbow_gloves, looking_at_viewer, solo, cleavage, wedding_dress, black_dress, bouquet, bridal_veil, bare_shoulders, black_gloves, feathers, smile, hair_flower, jewelry | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, dress, elbow_gloves, hair_ornament, looking_at_viewer, smile, solo, wings, feathers, jewelry, cleavage, single_glove | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bare_shoulders, belt, black_gloves, criss-cross_halter, elbow_gloves, jewelry, looking_at_viewer, navel, solo, cleavage, hair_ornament, smile, black_pants, collarbone, cowboy_shot, midriff, thigh_strap, black_choker, dress, fingerless_gloves, uneven_legwear, closed_mouth, lace_trim, parted_lips, single_pantsleg, single_thighhigh, stomach | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, black_bikini, cleavage, collarbone, looking_at_viewer, solo, halterneck, bare_shoulders, upper_body, armpits, parted_lips | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, armpits, bare_arms, bare_shoulders, blue_bikini, blush, cleavage, closed_mouth, collarbone, day, front-tie_bikini_top, halterneck, large_breasts, looking_at_viewer, navel, outdoors, sitting, smile, solo, stomach, string_bikini, thighs, water, wet, ocean, petals, side-tie_bikini_bottom, skindentation, sunlight, arm_up, rock, sky, arm_behind_head, arm_support, arms_behind_head, arms_up, tree | | 5 | 13 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, plaid_skirt, pleated_skirt, red_necktie, solo, looking_at_viewer, school_uniform, short_sleeves, smile, cleavage, collarbone, blush, necklace, collared_shirt, white_shirt, blue_shirt, grey_skirt, miniskirt, sitting | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, petals, ribbon, solo, bare_shoulders, hair_flower, looking_at_viewer, parted_lips, smile, collarbone, 
bracelet, cleavage, pink_dress, signature | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, black_shirt, blush, long_sleeves, looking_at_viewer, simple_background, smile, solo, black_pantyhose, grey_skirt, necklace, turtleneck, white_background, black_sweater, holding, gradient_background, sitting | | 8 | 9 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, blush, cleavage, collarbone, large_breasts, looking_at_viewer, solo, armpits, bare_shoulders, smile, arms_up, bare_arms, navel, on_back, panties, thighs, arms_behind_head, babydoll, bed_sheet, closed_mouth, on_bed, lingerie, parted_lips, see-through, stomach | | 9 | 16 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, looking_at_viewer, maid_headdress, solo, enmaided, frills, long_sleeves, smile, jewelry, white_gloves, bow, holding, maid_apron, red_eyes, upper_body | | 10 | 14 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, braided_bangs, ribbon_braid, cleavage, hair_ribbon, jewelry, looking_at_viewer, solo, white_gloves, midriff, smile, detached_collar, navel, coat_on_shoulders, crop_top, necktie, belt, underboob, blue_ribbon, short_shorts, blush, highleg_panties, stomach, white_background, closed_mouth, cowboy_shot, hair_ornament, hand_up, simple_background, skindentation, thigh_strap, white_shorts, blue_thighhighs, buckle, buttons, star_(symbol), striped_thighhighs, upper_body, vertical_stripes | | 11 | 8 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 1boy, 1girl, hetero, large_breasts, nipples, pov, solo_focus, blush, looking_at_viewer, penis, completely_nude, mosaic_censoring, sweat, collarbone, navel, open_mouth, sex, smile, vaginal, pussy, spread_legs, breasts_squeezed_together, cum_on_breasts, indoors, necklace, on_bed, paizuri, tongue_out | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | elbow_gloves | looking_at_viewer | solo | cleavage | wedding_dress | black_dress | bouquet | bridal_veil | bare_shoulders | black_gloves | feathers | smile | hair_flower | jewelry | dress | hair_ornament | wings | single_glove | belt | criss-cross_halter | navel | black_pants | collarbone | cowboy_shot | midriff | thigh_strap | black_choker | fingerless_gloves | uneven_legwear | closed_mouth | lace_trim | parted_lips | single_pantsleg | single_thighhigh | stomach | black_bikini | halterneck | upper_body | armpits | bare_arms | blue_bikini | blush | day | front-tie_bikini_top | large_breasts | outdoors | sitting | string_bikini | thighs | water | wet | ocean | petals | side-tie_bikini_bottom | skindentation | sunlight | arm_up | rock | sky | arm_behind_head | arm_support | arms_behind_head | arms_up | tree | plaid_skirt | pleated_skirt | red_necktie | school_uniform | short_sleeves | necklace | collared_shirt | white_shirt | blue_shirt | grey_skirt | miniskirt | ribbon | bracelet | pink_dress | signature | black_shirt | long_sleeves | simple_background 
| black_pantyhose | turtleneck | white_background | black_sweater | holding | gradient_background | on_back | panties | babydoll | bed_sheet | on_bed | lingerie | see-through | maid_headdress | enmaided | frills | white_gloves | bow | maid_apron | red_eyes | braided_bangs | ribbon_braid | hair_ribbon | detached_collar | coat_on_shoulders | crop_top | necktie | underboob | blue_ribbon | short_shorts | highleg_panties | hand_up | white_shorts | blue_thighhighs | buckle | buttons | star_(symbol) | striped_thighhighs | vertical_stripes | 1boy | hetero | nipples | pov | solo_focus | penis | completely_nude | mosaic_censoring | sweat | open_mouth | sex | vaginal | pussy | spread_legs | breasts_squeezed_together | cum_on_breasts | indoors | paizuri | tongue_out | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:--------------------|:-------|:-----------|:----------------|:--------------|:----------|:--------------|:-----------------|:---------------|:-----------|:--------|:--------------|:----------|:--------|:----------------|:--------|:---------------|:-------|:---------------------|:--------|:--------------|:-------------|:--------------|:----------|:--------------|:---------------|:--------------------|:-----------------|:---------------|:------------|:--------------|:------------------|:-------------------|:----------|:---------------|:-------------|:-------------|:----------|:------------|:--------------|:--------|:------|:-----------------------|:----------------|:-----------|:----------|:----------------|:---------|:--------|:------|:--------|:---------|:-------------------------|:----------------|:-----------|:---------|:-------|:------|:------------------|:--------------|:-------------------|:----------|:-------|:--------------|:----------------|:--------------|:-----------------|:----------------|:-----------|:-----------------|:--------------|:-------------|:-------------|:------------|:---------|:-----------|:-------------|:------------|:--------------|:---------------|:--------------------|:------------------|:-------------|:-------------------|:----------------|:----------|:----------------------|:----------|:----------|:-----------|:------------|:---------|:-----------|:--------------|:-----------------|:-----------|:---------|:---------------|:------|:-------------|:-----------|:----------------|:---------------|:--------------|:------------------|:--------------------|:-----------|:----------|:------------|:--------------|:---------------|:------------------|:----------|:---------------|:------------------|:---------|:----------|:----------------|:---------------------|:-------------------|:-------|:---------|:----------|:------|:-------------|:--------|:------------------|:-------------------|:--------|:-------------|:------|:----------|:--------|:--------------|:----------------------------|:-----------------|:----------|:----------|:-------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 
| | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | | | | | | | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | | | | | X | X | | X | | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | X | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | X | X | | | | | X | | | X | | | | | | | | | X | | X | | | | | | | X | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 13 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | X | X | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | X | X | | | | | X | | | X | X | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 9 | 
![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | X | X | X | | | | | X | | | X | | | | | | | | | X | | X | | | | | | | X | | X | | | X | | | | X | X | | X | | | X | | | | X | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 16 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | X | X | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 10 | 14 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | X | X | X | | | | | | | | X | | X | | X | | | X | | X | | | X | X | X | | | | X | | | | | X | | | X | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | 11 | 8 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | | X | | | | | | | | | | X | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
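### Export a Training Folder with Waifuc

The `LocalSource` from the loading snippet can be piped straight into a waifuc exporter to produce a training-ready folder of images plus tag files. The sketch below assumes your installed waifuc version provides `TextualInversionExporter`; if the exporter names differ, check the waifuc documentation linked above and substitute the appropriate one.

```python
from waifuc.export import TextualInversionExporter
from waifuc.source import LocalSource

# 'dataset_dir' is the directory the raw archive was extracted to in the loading snippet above
source = LocalSource('dataset_dir')

# writes each item as an image plus a same-named tag text file;
# TextualInversionExporter is assumed to exist in the installed waifuc version
source.export(TextualInversionExporter('export/hayami_kanade'))
```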
CyberHarem/hayami_kanade_idolmastercinderellagirls
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-11T23:20:59+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T12:13:23+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of hayami\_kanade/速水奏 (THE iDOLM@STER: Cinderella Girls) ================================================================ This is the dataset of hayami\_kanade/速水奏 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are 'short\_hair, yellow\_eyes, bangs, parted\_bangs, breasts, black\_hair, blue\_hair, medium\_breasts, earrings', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
fb50e6a3adf62907e8d650ea7580b180f2ec3ce3
# Dataset Card for "multi-class" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tingchih/multi-class
[ "region:us" ]
2023-09-11T23:25:48+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 98926083, "num_examples": 570999}, {"name": "test", "num_bytes": 42106324, "num_examples": 245116}], "download_size": 77717077, "dataset_size": 141032407}}
2023-09-12T03:21:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for "multi-class" More Information needed
[ "# Dataset Card for \"multi-class\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"multi-class\"\n\nMore Information needed" ]
[ 6, 13 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"multi-class\"\n\nMore Information needed" ]
9bf3c17ec15bf456307e31b1961e365aa4933f60
# Dataset Card for "msrc_data" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
whateverweird17/msrc_data
[ "region:us" ]
2023-09-11T23:57:17+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1434434, "num_examples": 5801}, {"name": "validation", "num_bytes": 287084.61885881744, "num_examples": 1161}], "download_size": 923273, "dataset_size": 1721518.6188588175}}
2023-09-11T23:57:18+00:00
[]
[]
TAGS #region-us
# Dataset Card for "msrc_data" More Information needed
[ "# Dataset Card for \"msrc_data\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"msrc_data\"\n\nMore Information needed" ]
[ 6, 14 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"msrc_data\"\n\nMore Information needed" ]
ee56f608760c0b4a290c1aa36327a9e884e25431
# Dataset of ibaraki_douji/茨木童子/茨木童子 (Fate/Grand Order) This is the dataset of ibaraki_douji/茨木童子/茨木童子 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are `horns, blonde_hair, facial_mark, oni_horns, long_hair, yellow_eyes, pointy_ears, breasts, earrings, small_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 734.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ibaraki_douji_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 418.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ibaraki_douji_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1181 | 863.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ibaraki_douji_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 646.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ibaraki_douji_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1181 | 1.20 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ibaraki_douji_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/ibaraki_douji_fgo', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, jewelry, looking_at_viewer, oni, sharp_fingernails, solo, tattoo, wide_sleeves, yellow_kimono, bare_shoulders, forehead_mark, off_shoulder, beads, closed_mouth, holding_sword, long_sleeves, simple_background, white_background, collarbone, toenails, very_long_hair | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, looking_at_viewer, off_shoulder, oni, solo, very_long_hair, yellow_kimono, jewelry, sharp_fingernails, tattoo, wide_sleeves, bare_shoulders, beads, smile | | 2 | 15 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, oni, solo, tattoo, yellow_kimono, looking_at_viewer, jewelry, off_shoulder, bare_shoulders, upper_body, slit_pupils | | 3 | 17 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, oni, solo, tattoo, yellow_kimono, looking_at_viewer, sharp_fingernails, fire, open_mouth, wide_sleeves, holding_sword, jewelry, teeth, bare_shoulders, beads, floral_print, :d, off_shoulder | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_shoulders, collarbone, forehead_mark, long_sleeves, looking_at_viewer, off_shoulder, oni, simple_background, solo, tattoo, white_background, wide_sleeves, yellow_kimono, jewelry, sitting, very_long_hair, candy, floral_print, open_mouth, sash, sharp_fingernails, tongue_out | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, bare_shoulders, black_bikini, blue_kimono, forehead_mark, looking_at_viewer, oni, solo, tattoo, collarbone, jewelry, very_long_hair, navel, open_kimono, smile, front-tie_bikini_top, off_shoulder, wide_sleeves, closed_mouth, holding, sharp_fingernails, fang_out, floral_print, long_sleeves, sidelocks, simple_background, white_background | | 6 | 14 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, forehead_mark, hetero, oni, penis, solo_focus, 1boy, tattoo, blush, nipples, looking_at_viewer, nude, pussy, sharp_fingernails, open_mouth, sex, vaginal, navel, bar_censor, spread_legs, tongue_out, jewelry, lying, sweat | ### Table Version | # | Samples | Img-1 | Img-2 | 
Img-3 | Img-4 | Img-5 | 1girl | jewelry | looking_at_viewer | oni | sharp_fingernails | solo | tattoo | wide_sleeves | yellow_kimono | bare_shoulders | forehead_mark | off_shoulder | beads | closed_mouth | holding_sword | long_sleeves | simple_background | white_background | collarbone | toenails | very_long_hair | smile | upper_body | slit_pupils | fire | open_mouth | teeth | floral_print | :d | sitting | candy | sash | tongue_out | black_bikini | blue_kimono | navel | open_kimono | front-tie_bikini_top | holding | fang_out | sidelocks | hetero | penis | solo_focus | 1boy | blush | nipples | nude | pussy | sex | vaginal | bar_censor | spread_legs | lying | sweat | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:--------------------|:------|:--------------------|:-------|:---------|:---------------|:----------------|:-----------------|:----------------|:---------------|:--------|:---------------|:----------------|:---------------|:--------------------|:-------------------|:-------------|:-----------|:-----------------|:--------|:-------------|:--------------|:-------|:-------------|:--------|:---------------|:-----|:----------|:--------|:-------|:-------------|:---------------|:--------------|:--------|:--------------|:-----------------------|:----------|:-----------|:------------|:---------|:--------|:-------------|:-------|:--------|:----------|:-------|:--------|:------|:----------|:-------------|:--------------|:--------|:--------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | X | | X | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 15 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | | X | X | | X | X | | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 17 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | X | X | X | X | X | | X | X | | X | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | | | | X | X | X | X | | X | | | | | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | X | X | X | X | X | | X | X | X | | X | | X | X | X | X | | X | X | | | | | | X | | | | | | X | X | X | X | X | 
X | X | X | | | | | | | | | | | | | | | | 6 | 14 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | X | X | | X | | | | X | | | | | | | | | | | | | | | X | | | | | | | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/ibaraki_douji_fgo
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T00:15:15+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T22:15:40+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of ibaraki\_douji/茨木童子/茨木童子 (Fate/Grand Order) ====================================================== This is the dataset of ibaraki\_douji/茨木童子/茨木童子 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are 'horns, blonde\_hair, facial\_mark, oni\_horns, long\_hair, yellow\_eyes, pointy\_ears, breasts, earrings, small\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
8f1c5c092a9518bcf0c8718a0f4a1a3ba144dfda
# Dataset of akane/アカネ (Pokémon) This is the dataset of akane/アカネ (Pokémon), containing 500 images and their tags. The core tags of this character are `pink_hair, breasts, twintails, pink_eyes, hair_ornament, hairclip, large_breasts, bangs, long_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 413.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akane_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 273.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akane_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1067 | 533.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akane_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 381.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akane_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1067 | 696.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akane_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/akane_pokemon', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
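The packaged archives listed in the table above can be fetched the same way as the raw package in the snippet above; only the filename changes. Below is a minimal sketch for the 800px IMG+TXT package, where the `dataset-800.zip` filename is inferred from the download URL in the table rather than stated in the prose:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Fetch the 800px IMG+TXT package; the filename mirrors the
# 'dataset-raw.zip' pattern used in the waifuc snippet and is
# inferred from the download URL in the package table (assumption).
zip_file = hf_hub_download(
    repo_id='CyberHarem/akane_pokemon',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract to a local directory of your choice
extract_dir = 'akane_800'
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)
```

The same pattern should apply to the 1200px and stage3 packages by swapping in the corresponding filename from the table.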
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1boy, 1girl, hetero, nipples, nude, penis, blush, pov, solo_focus, cum, fellatio, looking_at_viewer, paizuri, huge_breasts, sweat, censored, heart-shaped_pupils | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, buttons, eyelashes, looking_at_viewer, open_mouth, smile, tongue, white_jacket, blue_shorts, short_sleeves, ;d, heart, one_eye_closed, pokemon_(creature), solo, shirt, short_shorts, wristband | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, open_mouth, solo, blush, smile, looking_at_viewer, heart, huge_breasts | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, blush, nipples, open_shirt, solo, looking_at_viewer, open_mouth, breasts_out, buttons, navel, smile, collarbone, no_bra, shorts, simple_background | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, blush, nipples, nude, solo, pussy, lactation, navel, open_mouth | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | :d, official_alternate_costume, open_mouth, tongue, 1girl, blush, christmas, eyelashes, gloves, red_headwear, santa_hat, brown_belt, dress, closed_eyes, detached_sleeves, pokemon_(creature), white_shorts | | 6 | 19 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, hetero, nipples, 1boy, penis, sex, vaginal, blush, solo_focus, open_mouth, nude, spread_legs, mosaic_censoring, cum_in_pussy, uncensored | | 7 | 9 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, cow_print, solo, collar, huge_breasts, blush, cow_horns, elbow_gloves, neck_bell, cow_ears, cow_tail, open_mouth, cowbell, thighhighs, areola_slip, cow_girl, looking_at_viewer, smile, sweat | | 8 | 10 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, cow_print, hetero, 1boy, blush, cow_horns, huge_breasts, nipples, cow_ears, cowbell, neck_bell, solo_focus, collar, fake_animal_ears, nude, open_mouth, heart, 
penis, bikini, elbow_gloves, paizuri, sex, simple_background, sweat, tongue, white_background | | 9 | 6 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1boy, 1girl, blush, penis, smile, bikini, cow_print, solo_focus, breasts_squeezed_together, open_mouth, cum_on_breasts, heart, mosaic_censoring, paizuri_under_clothes, sweat | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | hetero | nipples | nude | penis | blush | pov | solo_focus | cum | fellatio | looking_at_viewer | paizuri | huge_breasts | sweat | censored | heart-shaped_pupils | buttons | eyelashes | open_mouth | smile | tongue | white_jacket | blue_shorts | short_sleeves | ;d | heart | one_eye_closed | pokemon_(creature) | solo | shirt | short_shorts | wristband | open_shirt | breasts_out | navel | collarbone | no_bra | shorts | simple_background | pussy | lactation | :d | official_alternate_costume | christmas | gloves | red_headwear | santa_hat | brown_belt | dress | closed_eyes | detached_sleeves | white_shorts | sex | vaginal | spread_legs | mosaic_censoring | cum_in_pussy | uncensored | cow_print | collar | cow_horns | elbow_gloves | neck_bell | cow_ears | cow_tail | cowbell | thighhighs | areola_slip | cow_girl | fake_animal_ears | bikini | white_background | breasts_squeezed_together | cum_on_breasts | paizuri_under_clothes | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:---------|:----------|:-------|:--------|:--------|:------|:-------------|:------|:-----------|:--------------------|:----------|:---------------|:--------|:-----------|:----------------------|:----------|:------------|:-------------|:--------|:---------|:---------------|:--------------|:----------------|:-----|:--------|:-----------------|:---------------------|:-------|:--------|:---------------|:------------|:-------------|:--------------|:--------|:-------------|:---------|:---------|:--------------------|:--------|:------------|:-----|:-----------------------------|:------------|:---------|:---------------|:------------|:-------------|:--------|:--------------|:-------------------|:---------------|:------|:----------|:--------------|:-------------------|:---------------|:-------------|:------------|:---------|:------------|:---------------|:------------|:-----------|:-----------|:----------|:-------------|:--------------|:-----------|:-------------------|:---------|:-------------------|:----------------------------|:-----------------|:------------------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | | X | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 10 | ![](samples/2/clu2-sample0.png) | 
![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | | X | | | | | X | | | | | X | | X | | | | | | X | X | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | | X | | X | | | X | | | | | X | | | | | | X | | X | X | | | | | | | | | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | | X | | X | X | | X | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | | X | | | | | X | | | | | | | | | | | | X | X | | X | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 19 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | X | X | X | X | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | 7 | 9 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | | X | | | | | X | | | | | X | | X | X | | | | | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | 8 | 10 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | X | X | X | X | X | | X | | | | X | X | X | | | | | X | | X | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | | X | | | | X | X | X | | | | | 9 | 6 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | | | | X | X | | X | | | | | | X | | | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | X | | X | X | X |
CyberHarem/akane_pokemon
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T00:21:29+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T20:48:56+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of akane/アカネ (Pokémon) ============================== This is the dataset of akane/アカネ (Pokémon), containing 500 images and their tags. The core tags of this character are 'pink\_hair, breasts, twintails, pink\_eyes, hair\_ornament, hairclip, large\_breasts, bangs, long\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
7b04b8c63d6974cc9444a60d5152919da4774841
# Dataset of odamaki_sapphire (Pokémon)

This is the dataset of odamaki_sapphire (Pokémon), containing 19 images and their tags.

The core tags of this character are `brown_hair, blue_eyes, breasts, large_breasts, fang`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| raw | 19 | 18.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/odamaki_sapphire_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 10.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/odamaki_sapphire_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 42 | 22.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/odamaki_sapphire_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 16.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/odamaki_sapphire_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 42 | 29.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/odamaki_sapphire_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |

### Load Raw Dataset with Waifuc

We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/odamaki_sapphire_pokemon',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering result, maybe some outfits can be mined here.
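If one of the IMG+TXT packages from the table is extracted instead of the raw archive, it can also be walked without waifuc. The sketch below assumes the usual IMG+TXT layout of one image file per sample with a same-named `.txt` file holding the comma-separated tags; the exact layout is not documented in this card, so treat it as an assumption:

```python
from pathlib import Path

# Directory where an IMG+TXT package (e.g. dataset-800.zip) was extracted;
# the directory name here is just a placeholder.
dataset_dir = Path('dataset_dir')

# Assumed layout: <stem>.png / <stem>.jpg next to <stem>.txt with the tags.
image_suffixes = {'.png', '.jpg', '.jpeg', '.webp'}
for image_path in sorted(dataset_dir.rglob('*')):
    if image_path.suffix.lower() not in image_suffixes:
        continue
    tag_file = image_path.with_suffix('.txt')
    if tag_file.exists():
        tags = tag_file.read_text(encoding='utf-8').strip()
        print(image_path.name, '->', tags)
```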
### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------|
| 0 | 19 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, open_mouth, solo, navel, scar, swimsuit |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | open_mouth | solo | navel | scar | swimsuit |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:-------|:--------|:-------|:-----------|
| 0 | 19 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X |
CyberHarem/odamaki_sapphire_pokemon
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T00:29:05+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T19:30:08+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of odamaki\_sapphire (Pokémon) ====================================== This is the dataset of odamaki\_sapphire (Pokémon), containing 19 images and their tags. The core tags of this character are 'brown\_hair, blue\_eyes, breasts, large\_breasts, fang', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
3d8b4011792a978b74dd4d5c6c086c6e20789276
# Dataset Card for "guanaco-llama2-chinese-1k"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
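Since the card itself is a stub, here is a minimal loading sketch based on the configuration metadata below (a single `train` split with one string column named `text`, 1000 rows). It assumes the repository is publicly accessible:

```python
from datasets import load_dataset

# Repo id, split name and column name are taken from the dataset_info
# metadata of this card (1000 rows, one "text" column in "train").
ds = load_dataset("elliotthwang/guanaco-llama2-chinese-1k", split="train")

print(ds)                    # expected: Dataset with features ['text'] and 1000 rows
print(ds[0]["text"][:200])   # peek at the first formatted example
```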
elliotthwang/guanaco-llama2-chinese-1k
[ "region:us" ]
2023-09-12T01:00:55+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1348677, "num_examples": 1000}], "download_size": 0, "dataset_size": 1348677}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-13T00:47:38+00:00
[]
[]
TAGS #region-us
# Dataset Card for "guanaco-llama2-chinese-1k" More Information needed
[ "# Dataset Card for \"guanaco-llama2-chinese-1k\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"guanaco-llama2-chinese-1k\"\n\nMore Information needed" ]
[ 6, 21 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"guanaco-llama2-chinese-1k\"\n\nMore Information needed" ]
7f2a53b684e2d133050fc15fb2237ef558f9bb7d
# Dataset Card for Evaluation run of quantumaikr/llama-2-70B-chat ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/quantumaikr/llama-2-70B-chat - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [quantumaikr/llama-2-70B-chat](https://huggingface.co/quantumaikr/llama-2-70B-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_quantumaikr__llama-2-70B-chat", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-09-12T02:16:02.997699](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__llama-2-70B-chat/blob/main/results_2023-09-12T02-16-02.997699.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6907220885493215, "acc_stderr": 0.03128215576907498, "acc_norm": 0.6945100749974468, "acc_norm_stderr": 0.03125451504899436, "mc1": 0.40269277845777235, "mc1_stderr": 0.01716883093518722, "mc2": 0.5731438157700953, "mc2_stderr": 0.014660092779943103 }, "harness|arc:challenge|25": { "acc": 0.6407849829351536, "acc_stderr": 0.014020224155839157, "acc_norm": 0.6757679180887372, "acc_norm_stderr": 0.01367881039951882 }, "harness|hellaswag|10": { "acc": 0.6809400517825135, "acc_stderr": 0.0046515972099930875, "acc_norm": 0.869448317068313, "acc_norm_stderr": 0.003362208481557298 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6, "acc_stderr": 0.04232073695151589, "acc_norm": 0.6, "acc_norm_stderr": 0.04232073695151589 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7960526315789473, "acc_stderr": 0.0327900040631005, "acc_norm": 0.7960526315789473, "acc_norm_stderr": 0.0327900040631005 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8194444444444444, "acc_stderr": 0.032166008088022675, "acc_norm": 0.8194444444444444, "acc_norm_stderr": 0.032166008088022675 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6473988439306358, "acc_stderr": 0.036430371689585475, "acc_norm": 0.6473988439306358, "acc_norm_stderr": 0.036430371689585475 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.30392156862745096, "acc_stderr": 0.045766654032077636, "acc_norm": 0.30392156862745096, "acc_norm_stderr": 0.045766654032077636 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6340425531914894, "acc_stderr": 0.0314895582974553, "acc_norm": 0.6340425531914894, "acc_norm_stderr": 0.0314895582974553 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.04677473004491199, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.455026455026455, "acc_stderr": 0.025646928361049398, "acc_norm": 0.455026455026455, "acc_norm_stderr": 0.025646928361049398 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5555555555555556, "acc_stderr": 0.044444444444444495, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.044444444444444495 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7935483870967742, "acc_stderr": 0.023025899617188716, "acc_norm": 0.7935483870967742, "acc_norm_stderr": 0.023025899617188716 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5467980295566502, "acc_stderr": 0.03502544650845872, "acc_norm": 0.5467980295566502, "acc_norm_stderr": 0.03502544650845872 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.77, "acc_stderr": 0.04229525846816505, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8545454545454545, "acc_stderr": 0.027530196355066584, "acc_norm": 0.8545454545454545, "acc_norm_stderr": 0.027530196355066584 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8585858585858586, "acc_stderr": 0.02482590979334333, "acc_norm": 0.8585858585858586, "acc_norm_stderr": 0.02482590979334333 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9430051813471503, "acc_stderr": 0.01673108529360755, "acc_norm": 0.9430051813471503, "acc_norm_stderr": 0.01673108529360755 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7025641025641025, "acc_stderr": 0.023177408131465942, "acc_norm": 0.7025641025641025, "acc_norm_stderr": 0.023177408131465942 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2777777777777778, "acc_stderr": 0.027309140588230172, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.027309140588230172 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7647058823529411, "acc_stderr": 0.02755361446786381, "acc_norm": 
0.7647058823529411, "acc_norm_stderr": 0.02755361446786381 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.4966887417218543, "acc_stderr": 0.04082393379449654, "acc_norm": 0.4966887417218543, "acc_norm_stderr": 0.04082393379449654 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8917431192660551, "acc_stderr": 0.01332134844761176, "acc_norm": 0.8917431192660551, "acc_norm_stderr": 0.01332134844761176 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5972222222222222, "acc_stderr": 0.03344887382997866, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.03344887382997866 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9117647058823529, "acc_stderr": 0.01990739979131695, "acc_norm": 0.9117647058823529, "acc_norm_stderr": 0.01990739979131695 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9071729957805907, "acc_stderr": 0.01888975055095671, "acc_norm": 0.9071729957805907, "acc_norm_stderr": 0.01888975055095671 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7937219730941704, "acc_stderr": 0.02715715047956382, "acc_norm": 0.7937219730941704, "acc_norm_stderr": 0.02715715047956382 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.0364129708131373, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.0364129708131373 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8925619834710744, "acc_stderr": 0.028268812192540637, "acc_norm": 0.8925619834710744, "acc_norm_stderr": 0.028268812192540637 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8159509202453987, "acc_stderr": 0.030446777687971726, "acc_norm": 0.8159509202453987, "acc_norm_stderr": 0.030446777687971726 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5982142857142857, "acc_stderr": 0.04653333146973647, "acc_norm": 0.5982142857142857, "acc_norm_stderr": 0.04653333146973647 }, "harness|hendrycksTest-management|5": { "acc": 0.8446601941747572, "acc_stderr": 0.03586594738573975, "acc_norm": 0.8446601941747572, "acc_norm_stderr": 0.03586594738573975 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9145299145299145, "acc_stderr": 0.01831589168562585, "acc_norm": 0.9145299145299145, "acc_norm_stderr": 0.01831589168562585 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.859514687100894, "acc_stderr": 0.012426211353093448, "acc_norm": 0.859514687100894, "acc_norm_stderr": 0.012426211353093448 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7572254335260116, "acc_stderr": 0.023083658586984204, "acc_norm": 0.7572254335260116, "acc_norm_stderr": 0.023083658586984204 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.5251396648044693, "acc_stderr": 0.01670135084268263, "acc_norm": 0.5251396648044693, "acc_norm_stderr": 0.01670135084268263 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7418300653594772, "acc_stderr": 0.025058503316958154, "acc_norm": 0.7418300653594772, "acc_norm_stderr": 0.025058503316958154 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7588424437299035, "acc_stderr": 0.024296594034763426, "acc_norm": 0.7588424437299035, "acc_norm_stderr": 0.024296594034763426 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.8271604938271605, "acc_stderr": 0.02103851777015737, "acc_norm": 0.8271604938271605, "acc_norm_stderr": 0.02103851777015737 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.574468085106383, "acc_stderr": 0.02949482760014437, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.02949482760014437 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5664928292046936, "acc_stderr": 0.01265681038398397, "acc_norm": 0.5664928292046936, "acc_norm_stderr": 0.01265681038398397 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7095588235294118, "acc_stderr": 0.027576468622740533, "acc_norm": 0.7095588235294118, "acc_norm_stderr": 0.027576468622740533 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7581699346405228, "acc_stderr": 0.017322789207784326, "acc_norm": 0.7581699346405228, "acc_norm_stderr": 0.017322789207784326 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7454545454545455, "acc_stderr": 0.041723430387053825, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.041723430387053825 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7673469387755102, "acc_stderr": 0.02704925791589618, "acc_norm": 0.7673469387755102, "acc_norm_stderr": 0.02704925791589618 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8706467661691543, "acc_stderr": 0.023729830881018515, "acc_norm": 0.8706467661691543, "acc_norm_stderr": 0.023729830881018515 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8654970760233918, "acc_stderr": 0.026168221344662297, "acc_norm": 0.8654970760233918, "acc_norm_stderr": 0.026168221344662297 }, "harness|truthfulqa:mc|0": { "mc1": 0.40269277845777235, "mc1_stderr": 0.01716883093518722, "mc2": 0.5731438157700953, "mc2_stderr": 0.014660092779943103 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
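Besides loading the per-task details as shown in the card above, the aggregated metrics can be pulled straight from the results JSON file linked in the "Latest results" section. The sketch below uses the filename referenced there; the `"results"` fallback key is an assumption about the file layout rather than something stated in the card:

```python
import json

from huggingface_hub import hf_hub_download

# The aggregated-results file linked from the "Latest results" section above.
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_quantumaikr__llama-2-70B-chat",
    repo_type="dataset",
    filename="results_2023-09-12T02-16-02.997699.json",
)

with open(results_path, encoding="utf-8") as f:
    data = json.load(f)

# Layout assumption: the metrics shown above may sit either at the top level
# or under a "results" key, so fall back gracefully.
metrics = data.get("results", data)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
```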
open-llm-leaderboard/details_quantumaikr__llama-2-70B-chat
[ "region:us" ]
2023-09-12T01:16:18+00:00
{"pretty_name": "Evaluation run of quantumaikr/llama-2-70B-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [quantumaikr/llama-2-70B-chat](https://huggingface.co/quantumaikr/llama-2-70B-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__llama-2-70B-chat\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-12T02:16:02.997699](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__llama-2-70B-chat/blob/main/results_2023-09-12T02-16-02.997699.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6907220885493215,\n \"acc_stderr\": 0.03128215576907498,\n \"acc_norm\": 0.6945100749974468,\n \"acc_norm_stderr\": 0.03125451504899436,\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5731438157700953,\n \"mc2_stderr\": 0.014660092779943103\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6407849829351536,\n \"acc_stderr\": 0.014020224155839157,\n \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.01367881039951882\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6809400517825135,\n \"acc_stderr\": 0.0046515972099930875,\n \"acc_norm\": 0.869448317068313,\n \"acc_norm_stderr\": 0.003362208481557298\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.0327900040631005,\n \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.0327900040631005\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 
0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6340425531914894,\n \"acc_stderr\": 0.0314895582974553,\n \"acc_norm\": 0.6340425531914894,\n \"acc_norm_stderr\": 0.0314895582974553\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.455026455026455,\n \"acc_stderr\": 0.025646928361049398,\n \"acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 0.025646928361049398\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188716,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188716\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5467980295566502,\n \"acc_stderr\": 0.03502544650845872,\n \"acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.03502544650845872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066584,\n \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066584\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8585858585858586,\n \"acc_stderr\": 0.02482590979334333,\n \"acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.02482590979334333\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7025641025641025,\n \"acc_stderr\": 
0.023177408131465942,\n \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465942\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230172,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230172\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02755361446786381,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02755361446786381\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8917431192660551,\n \"acc_stderr\": 0.01332134844761176,\n \"acc_norm\": 0.8917431192660551,\n \"acc_norm_stderr\": 0.01332134844761176\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997866,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997866\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.01990739979131695,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.01990739979131695\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540637,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540637\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.5982142857142857,\n \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573975,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573975\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.859514687100894,\n \"acc_stderr\": 0.012426211353093448,\n \"acc_norm\": 0.859514687100894,\n \"acc_norm_stderr\": 
0.012426211353093448\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5251396648044693,\n \"acc_stderr\": 0.01670135084268263,\n \"acc_norm\": 0.5251396648044693,\n \"acc_norm_stderr\": 0.01670135084268263\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958154,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958154\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7588424437299035,\n \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.7588424437299035,\n \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.02103851777015737,\n \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.02103851777015737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.02949482760014437,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.02949482760014437\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5664928292046936,\n \"acc_stderr\": 0.01265681038398397,\n \"acc_norm\": 0.5664928292046936,\n \"acc_norm_stderr\": 0.01265681038398397\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.027576468622740533,\n \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.027576468622740533\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.017322789207784326,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.017322789207784326\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.02704925791589618,\n \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.02704925791589618\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.023729830881018515,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.023729830881018515\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40269277845777235,\n \"mc1_stderr\": 0.01716883093518722,\n \"mc2\": 0.5731438157700953,\n \"mc2_stderr\": 0.014660092779943103\n }\n}\n```", "repo_url": "https://huggingface.co/quantumaikr/llama-2-70B-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": 
["**/details_harness|arc:challenge|25_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hellaswag|10_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T02-16-02.997699.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T02-16-02.997699.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T02-16-02.997699.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T02-16-02.997699.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T02-16-02.997699.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_12T02_16_02.997699", "path": ["results_2023-09-12T02-16-02.997699.parquet"]}, {"split": "latest", "path": ["results_2023-09-12T02-16-02.997699.parquet"]}]}]}
2023-09-12T01:17:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of quantumaikr/llama-2-70B-chat ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model quantumaikr/llama-2-70B-chat on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-12T02:16:02.997699 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
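The summary above ends with "you can for instance do the following", but the snippet itself was stripped in this flattened rendition. Below is a minimal sketch using the `datasets` library; the repo id is an assumption based on the Open LLM Leaderboard's usual `details_<org>__<model>` naming (it is not stated in this record), while the config name and the `latest` split are taken from the metadata above.

```python
from datasets import load_dataset

# Assumed repo id: the leaderboard usually publishes per-model detail repos
# as open-llm-leaderboard/details_<org>__<model>; adjust if it differs.
details = load_dataset(
    "open-llm-leaderboard/details_quantumaikr__llama-2-70B-chat",
    "harness_arc_challenge_25",  # one of the 61 configs listed in the metadata
    split="latest",              # the "latest" split points to the newest run
)
print(details[0])
```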
[ "# Dataset Card for Evaluation run of quantumaikr/llama-2-70B-chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/llama-2-70B-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-12T02:16:02.997699(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of quantumaikr/llama-2-70B-chat", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/llama-2-70B-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-12T02:16:02.997699(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 20, 31, 168, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of quantumaikr/llama-2-70B-chat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model quantumaikr/llama-2-70B-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-12T02:16:02.997699(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
173b7062ba18f473106354911fb3ac75864ce45d
Dataset for paper: FinVis-GPT: A Multimodal Large Language Model for Financial Chart Analysis (https://github.com/wwwadx/FinVis-GPT). The .zip file contains the images.
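The card only says that the images ship inside a .zip archive; the archive's exact name is not given. A hedged sketch of fetching the dataset repo with `huggingface_hub` and extracting whatever archives it contains:

```python
import glob
import os
import zipfile

from huggingface_hub import snapshot_download

# Download the whole dataset repo locally, then unpack any .zip archives found.
local_dir = snapshot_download(repo_id="wza/FinVis", repo_type="dataset")
for zip_path in glob.glob(os.path.join(local_dir, "**", "*.zip"), recursive=True):
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall("finvis_images")
```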
wza/FinVis
[ "license:apache-2.0", "region:us" ]
2023-09-12T01:18:20+00:00
{"license": "apache-2.0"}
2023-09-14T00:52:51+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
Dataset for paper: FinVis-GPT: A Multimodal Large Language Model for Financial Chart Analysis (URL). The .zip file contains the images.
[]
[ "TAGS\n#license-apache-2.0 #region-us \n" ]
[ 14 ]
[ "passage: TAGS\n#license-apache-2.0 #region-us \n" ]
ffbf4fe6d496f67a4f7cfd30cdaa0d94a336c0a5
# Dataset Card for "c4_biomedical_2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zxvix/c4_biomedical_2
[ "region:us" ]
2023-09-12T01:18:51+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "timestamp", "dtype": "timestamp[s]"}, {"name": "url", "dtype": "string"}, {"name": "original_text", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 3516783.122, "num_examples": 989}], "download_size": 2179356, "dataset_size": 3516783.122}}
2023-09-12T02:10:56+00:00
[]
[]
TAGS #region-us
# Dataset Card for "c4_biomedical_2" More Information needed
[ "# Dataset Card for \"c4_biomedical_2\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"c4_biomedical_2\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"c4_biomedical_2\"\n\nMore Information needed" ]
a77d71cf64c7984760c3e3262ac2d587ec6b901d
# Dataset of shirasaka_koume/白坂小梅 (THE iDOLM@STER: Cinderella Girls) This is the dataset of shirasaka_koume/白坂小梅 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are `hair_over_one_eye, blonde_hair, short_hair, earrings, red_eyes, ear_piercing, brown_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 560.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirasaka_koume_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 333.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirasaka_koume_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1147 | 687.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirasaka_koume_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 502.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirasaka_koume_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1147 | 957.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirasaka_koume_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/shirasaka_koume_idolmastercinderellagirls', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, choker, hair_flower, jewelry, solo, bags_under_eyes, hair_bow, skull, kimono, looking_at_viewer, ribbon, smile, bare_shoulders, blush, obi, dress, frills, lipstick, lolita_fashion, microphone, open_mouth | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bandages, bare_shoulders, detached_sleeves, sleeves_past_wrists, solo, bags_under_eyes, eyeball, looking_at_viewer, halloween, jack-o'-lantern, jewelry, blush, open_mouth, smile, spider_web_print, black_dress, ghost, ribbon, moon, pumpkin_hair_ornament | | 2 | 14 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, hood_down, hoodie, looking_at_viewer, bags_under_eyes, solo, sleeves_past_fingers, jewelry, skirt, blush, open_mouth, simple_background | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, hood_down, looking_at_viewer, piercing, simple_background, solo, white_background, blush, jewelry, sleeves_past_fingers, black_hoodie, blood, collarbone, long_sleeves, smile, closed_mouth, open_mouth, upper_body | | 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, hood_down, long_sleeves, looking_at_viewer, plaid_skirt, pleated_skirt, sleeves_past_fingers, solo, blush, jewelry, piercing, red_skirt, smile, black_hoodie, blood, simple_background, white_background, black_pantyhose, closed_mouth | | 5 | 12 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, bags_under_eyes, blush, looking_at_viewer, solo, apron, maid_headdress, open_mouth, sleeves_past_fingers, jewelry, skull, smile, enmaided | | 6 | 17 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, solo, blush, jewelry, looking_at_viewer, smile, bags_under_eyes, dress, hairclip, open_mouth, piercing, skirt, skull, mini_top_hat, striped_thighhighs, microphone, sleeves_past_fingers | | 7 | 7 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, blush, hat, looking_at_viewer, sleeves_past_wrists, solo, striped_sleeves, bags_under_eyes, choker, 
necklace, skirt, smile, bespectacled, ring, open_mouth, skull_print | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, hair_bow, looking_at_viewer, smile, solo, blush, white_background, bare_shoulders, open_mouth, piercing, simple_background, bridal_gauntlets, nail_polish, necklace, purple_bow, purple_dress, ribbon, rose, skull, upper_body | | 9 | 6 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, blush, piercing, pleated_skirt, white_shirt, bangs, red_neckerchief, sailor_collar, serafuku, short_sleeves, simple_background, solo, black_skirt, looking_at_viewer, thick_thighs, white_background, white_thighhighs, open_mouth | | 10 | 7 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1boy, 1girl, blush, hetero, shiny_hair, shiny_skin, black_hoodie, solo_focus, piercing, anus, ass_grab, deep_skin, from_behind, hood_down, looking_back, sleeves_past_wrists, thighhighs, bar_censor, black_panties, cum, nipples, oral, penis, red_skirt, small_breasts, thighs, white_panties | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | choker | hair_flower | jewelry | solo | bags_under_eyes | hair_bow | skull | kimono | looking_at_viewer | ribbon | smile | bare_shoulders | blush | obi | dress | frills | lipstick | lolita_fashion | microphone | open_mouth | bandages | detached_sleeves | sleeves_past_wrists | eyeball | halloween | jack-o'-lantern | spider_web_print | black_dress | ghost | moon | pumpkin_hair_ornament | hood_down | hoodie | sleeves_past_fingers | skirt | simple_background | piercing | white_background | black_hoodie | blood | collarbone | long_sleeves | closed_mouth | upper_body | plaid_skirt | pleated_skirt | red_skirt | black_pantyhose | apron | maid_headdress | enmaided | hairclip | mini_top_hat | striped_thighhighs | hat | striped_sleeves | necklace | bespectacled | ring | skull_print | bridal_gauntlets | nail_polish | purple_bow | purple_dress | rose | white_shirt | bangs | red_neckerchief | sailor_collar | serafuku | short_sleeves | black_skirt | thick_thighs | white_thighhighs | 1boy | hetero | shiny_hair | shiny_skin | solo_focus | anus | ass_grab | deep_skin | from_behind | looking_back | thighhighs | bar_censor | black_panties | cum | nipples | oral | penis | small_breasts | thighs | white_panties | 
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------|:--------------|:----------|:-------|:------------------|:-----------|:--------|:---------|:--------------------|:---------|:--------|:-----------------|:--------|:------|:--------|:---------|:-----------|:-----------------|:-------------|:-------------|:-----------|:-------------------|:----------------------|:----------|:------------|:------------------|:-------------------|:--------------|:--------|:-------|:------------------------|:------------|:---------|:-----------------------|:--------|:--------------------|:-----------|:-------------------|:---------------|:--------|:-------------|:---------------|:---------------|:-------------|:--------------|:----------------|:------------|:------------------|:--------|:-----------------|:-----------|:-----------|:---------------|:---------------------|:------|:------------------|:-----------|:---------------|:-------|:--------------|:-------------------|:--------------|:-------------|:---------------|:-------|:--------------|:--------|:------------------|:----------------|:-----------|:----------------|:--------------|:---------------|:-------------------|:-------|:---------|:-------------|:-------------|:-------------|:-------|:-----------|:------------|:--------------|:---------------|:-------------|:-------------|:----------------|:------|:----------|:-------|:--------|:----------------|:---------|:----------------| | 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | X | X | X | | | | X | X | X | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 14 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | X | X | X | | | | X | | | | X | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | X | X | | | | | X | | X | | X | | | | | | | X | | | | | | | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | X | | | | | X | | X | | X | | | | | | | | | | | | | | | | | | | X | | X | | X | X | X | X | X | | X | X | | X | X | X | X | | | 
| | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 12 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | X | X | X | | X | | X | | X | | X | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 17 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | X | X | X | | X | | X | | X | | X | | X | | | | X | X | | | | | | | | | | | | | | X | X | | X | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 7 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | | X | X | | | | X | | X | | X | | | | | | | X | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | | | X | | X | X | | X | X | X | X | X | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | | | | | | X | | | | | | | | | | | | | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 6 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | | X | | | | | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | 10 | 7 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | X | | | | | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
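The packaged IMG+TXT variants in the packages table above can be fetched the same way as the raw package; here is a minimal sketch for the 800px package, mirroring the waifuc example and using the `dataset-800.zip` filename from the download links:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Fetch the 800px IMG+TXT package listed in the packages table
zip_file = hf_hub_download(
    repo_id='CyberHarem/shirasaka_koume_idolmastercinderellagirls',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Extract it next to the working directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```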
CyberHarem/shirasaka_koume_idolmastercinderellagirls
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T01:18:59+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T09:14:48+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of shirasaka\_koume/白坂小梅 (THE iDOLM@STER: Cinderella Girls) =================================================================== This is the dataset of shirasaka\_koume/白坂小梅 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are 'hair\_over\_one\_eye, blonde\_hair, short\_hair, earrings, red\_eyes, ear\_piercing, brown\_eyes', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
bef5be551ac506475428bc0b9a3694a341a51243
# Dataset Card for MNBVC

## Table of Contents
- [Dataset Card for MNBVC](#dataset-card-for-mnbvc)
  - [Table of Contents](#table-of-contents)
  - [Dataset Description](#dataset-description)
    - [Dataset Introduction](#dataset-introduction)
  - [Data Subsets](#data-subsets)
  - [Data Format](#data-format)
    - [Text Data](#text-data)
    - [QA Data](#qa-data)
  - [Contributions](#contributions)

## Dataset Description

- **Homepage:** http://mnbvc.253874.net/
- **Repository:** https://github.com/esbatmop/MNBVC
- **Paper:** N/A
- **Leaderboard:** N/A
- **Point of Contact:** N/A

### Dataset Introduction

On 2023.1.1, the Liwu (里屋) community, the oldest and most mysterious community on the Chinese internet (bar none), solemnly announced:

Under the leadership of its wise and mighty moderators, the community is determined to play to its strengths and help the open-source community maintain a continuously updated, largest-of-its-kind corpus of Chinese internet text.

The MNBVC dataset on Huggingface is updated incrementally; please visit [https://github.com/esbatmop/MNBVC](https://github.com/esbatmop/MNBVC) for additional data that has not yet finished cleaning.

It can be loaded with the following script:

```python
from datasets import load_dataset
dataset = load_dataset("liwu/MNBVC", 'law_judgement', split='train', streaming=True)
next(iter(dataset))  # get the first line
```

## Data Subsets

The MNBVC dataset contains several subsets:

- `law_judgement`: text from legal judgment documents.
- `gov_xuexiqiangguo`: text from Xuexi Qiangguo (学习强国).
- `gov_report`: text from government work reports.
- `co_ann_report`: corporate annual report text.
- `code_metadata`: code metadata.
- `qa_zhihu`: question-answer data from Zhihu.
- `qa_wikihow`: question-answer data from wikiHow.
- `qa_mfa`: question-answer data from the Ministry of Foreign Affairs.
- `news_peoples_daily`: text from the People's Daily.
- `wikipedia`: text from Wikipedia.

## Data Format

The MNBVC dataset currently contains the following kinds of data:

### Text Data

Text data is organized in the following format (the field names are the literal Chinese keys used in the dataset, e.g. 文件名 = file name, 段落 = paragraphs, 行号 = line number, 内容 = content):

```json
{
    "文件名": datasets.Value("string"),
    "是否待查文件": datasets.Value("bool"),
    "是否重复文件": datasets.Value("bool"),
    "文件大小": datasets.Value("int32"),
    "simhash": datasets.Value("uint64"),
    "最长段落长度": datasets.Value("int32"),
    "段落数": datasets.Value("int32"),
    "去重段落数": datasets.Value("int32"),
    "低质量段落数": datasets.Value("int32"),
    "段落": [
        datasets.Features(
            {
                "行号": datasets.Value("int32"),
                "是否重复": datasets.Value("bool"),
                "是否跨文件重复": datasets.Value("bool"),
                "md5": datasets.Value("string"),
                "内容": datasets.Value("string"),
            }
        )
    ]
}
```

### QA Data

QA data is organized in the following format (问 = question, 答 = answer, 来源 = source, 元数据 = metadata):

```json
{
    "id": datasets.Value("int32"),
    "问": datasets.Value("string"),
    "答": datasets.Value("string"),
    "来源": datasets.Value("string"),
    "元数据": {
        "create_time": datasets.Value("string"),
        "问题明细": datasets.Value("string"),
        "回答明细": datasets.Value("string"),
        "扩展字段": datasets.Value("string"),
    }
}
```

Data uploaded in the early stages of the project uses the format below; this format will be deprecated and the corresponding data will be re-uploaded:

```json
{
    "text": datasets.Value("string"),
    "meta": datasets.Value("string")
}
```

### Contributions

Thanks to the [Liwu community](http://mnbvc.253874.net/) for constructing this dataset.
Thanks to [silver](https://github.com/silverriver) for adding this dataset.
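To make the QA schema above concrete, here is a minimal sketch of streaming one of the QA subsets and reading its fields. The choice of `qa_zhihu`, the three-record preview, and the assumption that this subset follows the QA format shown above are illustrative assumptions rather than statements from the card.

```python
# A minimal sketch, assuming the qa_zhihu subset follows the QA format above
# (fields 问 = question, 答 = answer, 来源 = source); not taken from the card.
from itertools import islice

from datasets import load_dataset

qa = load_dataset("liwu/MNBVC", "qa_zhihu", split="train", streaming=True)

# Preview a few records without downloading the whole subset.
for record in islice(qa, 3):
    question = record["问"]   # question text
    answer = record["答"]     # answer text
    source = record["来源"]   # source label
    print(f"[{source}] {question[:40]!r} -> {answer[:40]!r}")
```

Streaming mode is used here because some subsets (e.g. `law_judgement`) are large; dropping `streaming=True` would download the full subset before iteration.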
botp/liwu-MNBVC
[ "task_categories:text-generation", "task_categories:fill-mask", "task_ids:language-modeling", "task_ids:masked-language-modeling", "annotations_creators:other", "language_creators:other", "multilinguality:monolingual", "size_categories:unknown", "source_datasets:original", "language:zh", "license:mit", "region:us" ]
2023-09-12T01:35:33+00:00
{"annotations_creators": ["other"], "language_creators": ["other"], "language": ["zh"], "license": ["mit"], "multilinguality": ["monolingual"], "size_categories": ["unknown"], "source_datasets": ["original"], "task_categories": ["text-generation", "fill-mask"], "task_ids": ["language-modeling", "masked-language-modeling"], "pretty_name": "MNBVC", "duplicated_from": "liwu/MNBVC"}
2023-09-12T01:35:34+00:00
[]
[ "zh" ]
TAGS #task_categories-text-generation #task_categories-fill-mask #task_ids-language-modeling #task_ids-masked-language-modeling #annotations_creators-other #language_creators-other #multilinguality-monolingual #size_categories-unknown #source_datasets-original #language-Chinese #license-mit #region-us
# Dataset Card for MNBVC ## Table of Contents - Dataset Card for MNBVC - Table of Contents - Dataset Description - 数据集介绍 - 数据子集 - 数据格式 - 文本数据 - 问答数据 - Contributions ## Dataset Description - Homepage: URL - Repository: URL - Paper: N/A - Leaderboard: N/A - Point of Contact: N/A ### 数据集介绍 中文互联网上最古老最神秘(没有之一)的里屋社区于2023.1.1庄重宣布: 在英明神武的里屋管子带领下,决心发挥社区所长(哪都长),帮助开源社区长期更新一份最大的中文互联网语料集。 Huggingface上的MNBVC数据集在逐渐更新中,请到https://URL 获取未完成清洗的更多数据。 可以使用如下脚本加载: ## 数据子集 MNBVC数据集包含数个子集: - 'law_judgement': 来自法律文书的文本。 - 'gov_xuexiqiangguo': 来自学习强国的文本。 - 'gov_report': 来自政府工作报告的文本。 - 'co_ann_report': 企业年报文本。 - 'code_metadata': 代码元数据。 - 'qa_zhihu': 来自知乎的问答数据。 - 'qa_wikihow': 来自wikihow的问答数据。 - 'qa_mfa': 外交部问答数据。 - 'news_peoples_daily': 来自人民日报的文本数据。 - 'wikipedia': 来自维基百科的文本数据。 ## 数据格式 目前MNBVC数据集包含如下几类数据: ### 文本数据 文本数据使用如下格式组织: ### 问答数据 问答数据使用如下格式组织: 项目早期所上传的数据使用如下格式,以后这一格式会被废弃,相应数据也会重新上传: ### Contributions Thanks to the Liwu community for constructing this dataset. Thanks to silver for adding this dataset.
[ "# Dataset Card for MNBVC", "## Table of Contents\n- Dataset Card for MNBVC\n - Table of Contents\n - Dataset Description\n - 数据集介绍\n - 数据子集\n - 数据格式\n - 文本数据\n - 问答数据\n - Contributions", "## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: N/A\n- Leaderboard: N/A\n- Point of Contact: N/A", "### 数据集介绍\n\n中文互联网上最古老最神秘(没有之一)的里屋社区于2023.1.1庄重宣布:\n\n在英明神武的里屋管子带领下,决心发挥社区所长(哪都长),帮助开源社区长期更新一份最大的中文互联网语料集。\n\nHuggingface上的MNBVC数据集在逐渐更新中,请到https://URL 获取未完成清洗的更多数据。\n\n可以使用如下脚本加载:", "## 数据子集\n\nMNBVC数据集包含数个子集:\n\n- 'law_judgement': 来自法律文书的文本。\n- 'gov_xuexiqiangguo': 来自学习强国的文本。\n- 'gov_report': 来自政府工作报告的文本。\n- 'co_ann_report': 企业年报文本。\n- 'code_metadata': 代码元数据。\n- 'qa_zhihu': 来自知乎的问答数据。\n- 'qa_wikihow': 来自wikihow的问答数据。\n- 'qa_mfa': 外交部问答数据。\n- 'news_peoples_daily': 来自人民日报的文本数据。\n- 'wikipedia': 来自维基百科的文本数据。", "## 数据格式\n\n目前MNBVC数据集包含如下几类数据:", "### 文本数据\n文本数据使用如下格式组织:", "### 问答数据\n问答数据使用如下格式组织:\n\n\n项目早期所上传的数据使用如下格式,以后这一格式会被废弃,相应数据也会重新上传:", "### Contributions\n\nThanks to the Liwu community for constructing this dataset.\nThanks to silver for adding this dataset." ]
[ "TAGS\n#task_categories-text-generation #task_categories-fill-mask #task_ids-language-modeling #task_ids-masked-language-modeling #annotations_creators-other #language_creators-other #multilinguality-monolingual #size_categories-unknown #source_datasets-original #language-Chinese #license-mit #region-us \n", "# Dataset Card for MNBVC", "## Table of Contents\n- Dataset Card for MNBVC\n - Table of Contents\n - Dataset Description\n - 数据集介绍\n - 数据子集\n - 数据格式\n - 文本数据\n - 问答数据\n - Contributions", "## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: N/A\n- Leaderboard: N/A\n- Point of Contact: N/A", "### 数据集介绍\n\n中文互联网上最古老最神秘(没有之一)的里屋社区于2023.1.1庄重宣布:\n\n在英明神武的里屋管子带领下,决心发挥社区所长(哪都长),帮助开源社区长期更新一份最大的中文互联网语料集。\n\nHuggingface上的MNBVC数据集在逐渐更新中,请到https://URL 获取未完成清洗的更多数据。\n\n可以使用如下脚本加载:", "## 数据子集\n\nMNBVC数据集包含数个子集:\n\n- 'law_judgement': 来自法律文书的文本。\n- 'gov_xuexiqiangguo': 来自学习强国的文本。\n- 'gov_report': 来自政府工作报告的文本。\n- 'co_ann_report': 企业年报文本。\n- 'code_metadata': 代码元数据。\n- 'qa_zhihu': 来自知乎的问答数据。\n- 'qa_wikihow': 来自wikihow的问答数据。\n- 'qa_mfa': 外交部问答数据。\n- 'news_peoples_daily': 来自人民日报的文本数据。\n- 'wikipedia': 来自维基百科的文本数据。", "## 数据格式\n\n目前MNBVC数据集包含如下几类数据:", "### 文本数据\n文本数据使用如下格式组织:", "### 问答数据\n问答数据使用如下格式组织:\n\n\n项目早期所上传的数据使用如下格式,以后这一格式会被废弃,相应数据也会重新上传:", "### Contributions\n\nThanks to the Liwu community for constructing this dataset.\nThanks to silver for adding this dataset." ]
[ 104, 8, 49, 35, 104, 174, 16, 13, 38, 27 ]
[ "passage: TAGS\n#task_categories-text-generation #task_categories-fill-mask #task_ids-language-modeling #task_ids-masked-language-modeling #annotations_creators-other #language_creators-other #multilinguality-monolingual #size_categories-unknown #source_datasets-original #language-Chinese #license-mit #region-us \n# Dataset Card for MNBVC## Table of Contents\n- Dataset Card for MNBVC\n - Table of Contents\n - Dataset Description\n - 数据集介绍\n - 数据子集\n - 数据格式\n - 文本数据\n - 问答数据\n - Contributions## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: N/A\n- Leaderboard: N/A\n- Point of Contact: N/A### 数据集介绍\n\n中文互联网上最古老最神秘(没有之一)的里屋社区于2023.1.1庄重宣布:\n\n在英明神武的里屋管子带领下,决心发挥社区所长(哪都长),帮助开源社区长期更新一份最大的中文互联网语料集。\n\nHuggingface上的MNBVC数据集在逐渐更新中,请到https://URL 获取未完成清洗的更多数据。\n\n可以使用如下脚本加载:## 数据子集\n\nMNBVC数据集包含数个子集:\n\n- 'law_judgement': 来自法律文书的文本。\n- 'gov_xuexiqiangguo': 来自学习强国的文本。\n- 'gov_report': 来自政府工作报告的文本。\n- 'co_ann_report': 企业年报文本。\n- 'code_metadata': 代码元数据。\n- 'qa_zhihu': 来自知乎的问答数据。\n- 'qa_wikihow': 来自wikihow的问答数据。\n- 'qa_mfa': 外交部问答数据。\n- 'news_peoples_daily': 来自人民日报的文本数据。\n- 'wikipedia': 来自维基百科的文本数据。## 数据格式\n\n目前MNBVC数据集包含如下几类数据:### 文本数据\n文本数据使用如下格式组织:" ]
b193cb5c80201b3557a954638a9d1b664c068c1e
# Dataset of nitocris/ニトクリス/尼托克丽丝 (Fate/Grand Order) This is the dataset of nitocris/ニトクリス/尼托克丽丝 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are `purple_hair, dark-skinned_female, dark_skin, long_hair, animal_ears, facial_mark, purple_eyes, jackal_ears, very_long_hair, breasts, earrings, hairband, bangs, hoop_earrings, sidelocks, medium_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 826.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nitocris_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 433.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nitocris_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1257 | 932.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nitocris_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 713.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nitocris_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1257 | 1.34 GiB | [Download](https://huggingface.co/datasets/CyberHarem/nitocris_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/nitocris_fgo', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
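Beyond the raw waifuc archive, the package table above also lists plain IMG+TXT bundles. The sketch below is an assumption, not part of the card: it presumes that `dataset-800.zip` stores each image next to a same-named `.txt` file of comma-separated tags, and the local directory name is arbitrary.

```python
# A minimal sketch, assuming the IMG+TXT packages pair every image with a
# same-named .txt file of comma-separated tags; this pairing is an assumption.
import os
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/nitocris_fgo',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to a local directory (name chosen arbitrarily)
dataset_dir = 'nitocris_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# walk the extracted files and pair each image with its tag file, if present
for image_path in sorted(Path(dataset_dir).rglob('*')):
    if image_path.suffix.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    tag_path = image_path.with_suffix('.txt')
    if tag_path.exists():
        tags = [t.strip() for t in tag_path.read_text(encoding='utf-8').split(',') if t.strip()]
        print(image_path.name, len(tags), 'tags')
```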
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, egyptian_clothes, solo, looking_at_viewer, bracelet, holding_staff, navel, smile, simple_background, usekh_collar, closed_mouth, pelvic_curtain | | 1 | 13 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, white_bikini, tiara, bare_shoulders, looking_at_viewer, smile, cleavage, ponytail, blush, facepaint, navel, sarong, armlet, closed_mouth, necklace, large_breasts, simple_background, white_background | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, jewelry, looking_at_viewer, solo, white_one-piece_swimsuit, ass, bare_shoulders, facepaint, simple_background, thighs, blush, large_breasts, low-tied_long_hair, white_background, closed_mouth, hair_tubes, looking_back, open_mouth | | 3 | 16 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, looking_at_viewer, solo, necklace, smile, blush, white_one-piece_swimsuit, collarbone, open_mouth, large_breasts, covered_navel | | 4 | 11 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, day, solo, white_one-piece_swimsuit, looking_at_viewer, blue_sky, cloud, outdoors, beach, smile, covered_navel, large_breasts, necklace, blush, low-tied_long_hair, thighs | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1boy, 1girl, blush, facepaint, hetero, jewelry, large_breasts, nipples, nude, open_mouth, thighs, collarbone, cum_in_pussy, mosaic_censoring, vaginal, hair_tubes, low-tied_long_hair, navel, sex_from_behind, solo_focus, sweat, bent_over, looking_at_viewer, tongue_out | | 6 | 10 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, smile, solo, looking_at_viewer, skirt, white_gloves, low-tied_long_hair, navel, red_necktie, elbow_gloves, hair_between_eyes, peaked_cap, midriff, belt, closed_mouth, facepaint, bracelet, fingerless_gloves, full_body, simple_background, white_background, white_headwear, white_thighhighs | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | egyptian_clothes | solo | looking_at_viewer | bracelet | holding_staff | navel | smile | simple_background | usekh_collar | closed_mouth | 
pelvic_curtain | white_bikini | tiara | bare_shoulders | cleavage | ponytail | blush | facepaint | sarong | armlet | necklace | large_breasts | white_background | jewelry | white_one-piece_swimsuit | ass | thighs | low-tied_long_hair | hair_tubes | looking_back | open_mouth | collarbone | covered_navel | day | blue_sky | cloud | outdoors | beach | 1boy | hetero | nipples | nude | cum_in_pussy | mosaic_censoring | vaginal | sex_from_behind | solo_focus | sweat | bent_over | tongue_out | skirt | white_gloves | red_necktie | elbow_gloves | hair_between_eyes | peaked_cap | midriff | belt | fingerless_gloves | full_body | white_headwear | white_thighhighs | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:-------|:--------------------|:-----------|:----------------|:--------|:--------|:--------------------|:---------------|:---------------|:-----------------|:---------------|:--------|:-----------------|:-----------|:-----------|:--------|:------------|:---------|:---------|:-----------|:----------------|:-------------------|:----------|:---------------------------|:------|:---------|:---------------------|:-------------|:---------------|:-------------|:-------------|:----------------|:------|:-----------|:--------|:-----------|:--------|:-------|:---------|:----------|:-------|:---------------|:-------------------|:----------|:------------------|:-------------|:--------|:------------|:-------------|:--------|:---------------|:--------------|:---------------|:--------------------|:-------------|:----------|:-------|:--------------------|:------------|:-----------------|:-------------------| | 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 13 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | X | | | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | X | | | | | X | | X | | | | X | | | X | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 16 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | | | | X | | | | | | | | | | X | | | | X | X | | | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 11 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | X | | | | X | | | | | | | | | | X | | | | X | X | | | X | | X | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | 
![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | X | | | X | | | | | | | | | | | X | X | | | | X | | X | | | X | X | X | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | 6 | 10 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | X | X | | X | X | X | | X | | | | | | | | X | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/nitocris_fgo
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T01:48:09+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-11T22:26:19+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of nitocris/ニトクリス/尼托克丽丝 (Fate/Grand Order) ================================================== This is the dataset of nitocris/ニトクリス/尼托克丽丝 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are 'purple\_hair, dark-skinned\_female, dark\_skin, long\_hair, animal\_ears, facial\_mark, purple\_eyes, jackal\_ears, very\_long\_hair, breasts, earrings, hairband, bangs, hoop\_earrings, sidelocks, medium\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
70538631943dc1517ae9bb0733251809f4445d0e
SafetyBench is a comprehensive benchmark for evaluating the safety of LLMs, comprising 11,435 diverse multiple-choice questions spanning 7 distinct categories of safety concerns. Notably, SafetyBench incorporates both Chinese and English data, facilitating evaluation in both languages. Please visit our [GitHub](https://github.com/thu-coai/SafetyBench) and [website](https://llmbench.ai/safety) or check our [paper](https://arxiv.org/abs/2309.07045) for more details.

We release three different test sets: a Chinese test set (`test_zh.json`), an English test set (`test_en.json`), and a Chinese subset test set (`test_zh_subset.json`). The Chinese subset test set removes samples containing highly sensitive keywords so that API-based Chinese LLMs with strict filtering mechanisms can be tested. We also release 5 examples for each of the 7 safety categories in both Chinese (`dev_zh.json`) and English (`dev_en.json`) to enable few-shot evaluation.

Labels for the test sets are not released; users are required to submit their results to automatically obtain test accuracy. [How to submit?](https://github.com/thu-coai/SafetyBench#how-to-submit)

## Load the Data

```python
from datasets import load_dataset

test_dataset = load_dataset("thu-coai/SafetyBench", "test")
dev_dataset = load_dataset("thu-coai/SafetyBench", "dev")
```

More details about loading the data and evaluating LLMs can be found in our [github repository](https://github.com/thu-coai/SafetyBench).

Please cite our paper if you use our dataset.

```
@article{zhang2023safetybench,
  title={SafetyBench: Evaluating the Safety of Large Language Models with Multiple Choice Questions},
  author={Zhexin Zhang and Leqi Lei and Lindong Wu and Rui Sun and Yongkang Huang and Chong Long and Xiao Liu and Xuanyu Lei and Jie Tang and Minlie Huang},
  journal={arXiv preprint arXiv:2309.07045},
  year={2023}
}
```
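As a sanity check before running a few-shot evaluation, the following sketch loads both released configurations and reports the split sizes. It relies only on the config and split names documented above; the per-question field names are not assumed and are simply printed for inspection.

```python
# A minimal sketch: inspect the released configs and splits before evaluation.
# Only the config names ("test", "dev") and splits documented above are used.
from datasets import load_dataset

test_dataset = load_dataset("thu-coai/SafetyBench", "test")
dev_dataset = load_dataset("thu-coai/SafetyBench", "dev")

for name, ds in [("test", test_dataset), ("dev", dev_dataset)]:
    for split in ds:
        print(f"{name}/{split}: {len(ds[split])} examples")

# Peek at one few-shot example to see which fields are available.
first_dev_example = dev_dataset["en"][0]
print(sorted(first_dev_example.keys()))
```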
thu-coai/SafetyBench
[ "license:mit", "arxiv:2309.07045", "region:us" ]
2023-09-12T01:59:06+00:00
{"license": "mit", "configs": [{"config_name": "test", "data_files": [{"split": "zh", "path": "test_zh.json"}, {"split": "en", "path": "test_en.json"}, {"split": "zh_subset", "path": "test_zh_subset.json"}]}, {"config_name": "dev", "data_files": [{"split": "zh", "path": "dev_zh.json"}, {"split": "en", "path": "dev_en.json"}]}]}
2023-09-14T04:25:39+00:00
[ "2309.07045" ]
[]
TAGS #license-mit #arxiv-2309.07045 #region-us
SafetyBench is a comprehensive benchmark for evaluating the safety of LLMs, which comprises 11,435 diverse multiple choice questions spanning across 7 distinct categories of safety concerns. Notably, SafetyBench also incorporates both Chinese and English data, facilitating the evaluation in both languages. Please visit our GitHub and website or check our paper for more details. We release three differents test sets including Chinese testset ('test_zh.json'), English testset ('test_en.json') and Chinese subset testset ('test_zh_subset.json'). The Chinese subset testset removes samples with highly sensitive keywords to test API-based Chinese LLMs with strict filtering mechanism. We also release 5 examples for each of the 7 safety categories in both Chinese ('dev_zh.json') and English ('dev_en.json'), to enable few-shot evaluation. Labels on the test sets are not released and users are required to submit their results to automatically obtain test accuracy. How to submit? ## Load the Data More details about loading the data and evaluating LLMs could be found at our github repository. Please cite our paper if you use our dataset.
[ "## Load the Data\n\n\n\nMore details about loading the data and evaluating LLMs could be found at our github repository.\n\nPlease cite our paper if you use our dataset." ]
[ "TAGS\n#license-mit #arxiv-2309.07045 #region-us \n", "## Load the Data\n\n\n\nMore details about loading the data and evaluating LLMs could be found at our github repository.\n\nPlease cite our paper if you use our dataset." ]
[ 19, 40 ]
[ "passage: TAGS\n#license-mit #arxiv-2309.07045 #region-us \n## Load the Data\n\n\n\nMore details about loading the data and evaluating LLMs could be found at our github repository.\n\nPlease cite our paper if you use our dataset." ]
74ca5f9774ec38460a321511896e321e4fd08dd0
# Dataset of lutia/ルチア (Pokémon) This is the dataset of lutia/ルチア (Pokémon), containing 377 images and their tags. The core tags of this character are `green_hair, hair_ornament, green_eyes, long_hair, earrings, sidelocks, eyelashes, bangs`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 377 | 421.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutia_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 377 | 264.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutia_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 848 | 527.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutia_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 377 | 385.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutia_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 848 | 709.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutia_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/lutia_pokemon', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
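As with the other packages, the IMG+TXT bundles can be used directly without waifuc. The sketch below is an assumption, not from the card: it presumes each extracted image has a same-named `.txt` file of comma-separated tags, and it simply tallies tag frequencies, which should roughly mirror the clusters listed above.

```python
# A minimal sketch, assuming each .txt in the IMG+TXT package holds
# comma-separated tags for its image; the pairing convention is an assumption.
import os
import zipfile
from collections import Counter
from pathlib import Path

from huggingface_hub import hf_hub_download

zip_file = hf_hub_download(
    repo_id='CyberHarem/lutia_pokemon',
    repo_type='dataset',
    filename='dataset-800.zip',
)

dataset_dir = 'lutia_800'   # arbitrary local directory name
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# count how often each tag appears across all tag files
tag_counts = Counter()
for tag_path in Path(dataset_dir).rglob('*.txt'):
    tags = (t.strip() for t in tag_path.read_text(encoding='utf-8').split(','))
    tag_counts.update(t for t in tags if t)

for tag, count in tag_counts.most_common(15):
    print(f"{count:4d}  {tag}")
```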
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 17 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, aqua_hair, looking_at_viewer, single_thighhigh, smile, choker, jewelry, midriff, open_mouth, overskirt, aqua_eyes, arm_warmers, navel, asymmetrical_hair, idol, solo, striped_thighhighs, blush, pokemon_(creature), short_shorts, simple_background, nail_polish | | 1 | 18 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, arm_warmers, jewelry, looking_at_viewer, open_mouth, overskirt, smile, tongue, single_thighhigh, ;d, navel, one_eye_closed, shorts_under_skirt, midriff, choker, arm_up, blue_footwear, blush, sparkle, upper_teeth_only, boots, pokemon_(creature), striped_thighhighs, solo | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, detached_sleeves, looking_at_viewer, official_alternate_costume, open_mouth, tongue, pokemon_(creature), sash, :d, ;d, blue_kimono, hand_up, one_eye_closed | | 3 | 10 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, hetero, blush, navel, nipples, open_mouth, solo_focus, vaginal, choker, cum_in_pussy, one_eye_closed, aqua_eyes, aqua_hair, jewelry, 1boy, large_breasts, multiple_penises, smile, spread_legs, sweat, 3boys, arm_warmers, asymmetrical_hair, gangbang, handjob, pubic_hair, single_thighhigh, uncensored | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, arms_behind_back, ball_gag, breasts, full_body, gagged, solo, jewelry, looking_at_viewer, navel, barefoot, asymmetrical_hair, bikini, blue_footwear, blush, crotch_rope, knees, panties, shibari, standing | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | aqua_hair | looking_at_viewer | single_thighhigh | smile | choker | jewelry | midriff | open_mouth | overskirt | aqua_eyes | arm_warmers | navel | asymmetrical_hair | idol | solo | striped_thighhighs | blush | pokemon_(creature) | short_shorts | simple_background | nail_polish | tongue | ;d | one_eye_closed | shorts_under_skirt | arm_up | blue_footwear | sparkle | upper_teeth_only | boots | detached_sleeves | official_alternate_costume | sash | :d | blue_kimono | hand_up | hetero | nipples | solo_focus | vaginal | cum_in_pussy | 1boy | large_breasts | multiple_penises | spread_legs | sweat | 3boys | gangbang | handjob | pubic_hair | uncensored | arms_behind_back | ball_gag | breasts | full_body | gagged | barefoot | bikini | crotch_rope | knees | 
panties | shibari | standing | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:--------------------|:-------------------|:--------|:---------|:----------|:----------|:-------------|:------------|:------------|:--------------|:--------|:--------------------|:-------|:-------|:---------------------|:--------|:---------------------|:---------------|:--------------------|:--------------|:---------|:-----|:-----------------|:---------------------|:---------|:----------------|:----------|:-------------------|:--------|:-------------------|:-----------------------------|:-------|:-----|:--------------|:----------|:---------|:----------|:-------------|:----------|:---------------|:-------|:----------------|:-------------------|:--------------|:--------|:--------|:-----------|:----------|:-------------|:-------------|:-------------------|:-----------|:----------|:------------|:---------|:-----------|:---------|:--------------|:--------|:----------|:----------|:-----------| | 0 | 17 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 18 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | X | X | X | X | X | X | X | | X | X | | | X | X | X | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | | | | | X | | | | | | | | | | X | | | | X | X | X | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 10 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | X | X | X | X | | X | | X | X | X | X | | | | X | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | | | | X | | | | | | X | X | | X | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/lutia_pokemon
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T02:03:28+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T20:37:33+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of lutia/ルチア (Pokémon) ============================== This is the dataset of lutia/ルチア (Pokémon), containing 377 images and their tags. The core tags of this character are 'green\_hair, hair\_ornament, green\_eyes, long\_hair, earrings, sidelocks, eyelashes, bangs', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
8071cc256c724cffc170e870ed71ac483ec4042f
# Dataset Card for Evaluation run of Mikivis/gpt2-large-lora-stf4

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/Mikivis/gpt2-large-lora-stf4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Mikivis/gpt2-large-lora-stf4](https://huggingface.co/Mikivis/gpt2-large-lora-stf4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-27T23:48:52.785657](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4/blob/main/results_2023-10-27T23-48-52.785657.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.003460570469798658,
        "em_stderr": 0.0006013962884271089,
        "f1": 0.07443372483221503,
        "f1_stderr": 0.0016782330994195233,
        "acc": 0.26795580110497236,
        "acc_stderr": 0.007008096716979156
    },
    "harness|drop|3": {
        "em": 0.003460570469798658,
        "em_stderr": 0.0006013962884271089,
        "f1": 0.07443372483221503,
        "f1_stderr": 0.0016782330994195233
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5359116022099447,
        "acc_stderr": 0.014016193433958312
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
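As a follow-up to the loading snippet above, this sketch reuses the same documented call and then inspects what came back. Nothing beyond the documented config and split names is assumed; the column listing is purely for inspection.

```python
# A minimal sketch: reuse the documented load_dataset call and inspect the result.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4",
    "harness_winogrande_5",
    split="train",   # per the card, "train" always points to the latest results
)

print(data.num_rows, "evaluated examples")
print(data.column_names)   # see which per-example fields were recorded
```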
open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4
[ "region:us" ]
2023-09-12T02:05:17+00:00
{"pretty_name": "Evaluation run of Mikivis/gpt2-large-lora-stf4", "dataset_summary": "Dataset automatically created during the evaluation run of model [Mikivis/gpt2-large-lora-stf4](https://huggingface.co/Mikivis/gpt2-large-lora-stf4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-27T23:48:52.785657](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4/blob/main/results_2023-10-27T23-48-52.785657.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003460570469798658,\n \"em_stderr\": 0.0006013962884271089,\n \"f1\": 0.07443372483221503,\n \"f1_stderr\": 0.0016782330994195233,\n \"acc\": 0.26795580110497236,\n \"acc_stderr\": 0.007008096716979156\n },\n \"harness|drop|3\": {\n \"em\": 0.003460570469798658,\n \"em_stderr\": 0.0006013962884271089,\n \"f1\": 0.07443372483221503,\n \"f1_stderr\": 0.0016782330994195233\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5359116022099447,\n \"acc_stderr\": 0.014016193433958312\n }\n}\n```", "repo_url": "https://huggingface.co/Mikivis/gpt2-large-lora-stf4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|arc:challenge|25_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_27T23_48_52.785657", "path": ["**/details_harness|drop|3_2023-10-27T23-48-52.785657.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-27T23-48-52.785657.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_27T23_48_52.785657", "path": ["**/details_harness|gsm8k|5_2023-10-27T23-48-52.785657.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-27T23-48-52.785657.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hellaswag|10_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T03-05-07.244584.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T03-05-07.244584.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T03-05-07.244584.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T03-05-07.244584.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T03-05-07.244584.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_27T23_48_52.785657", "path": ["**/details_harness|winogrande|5_2023-10-27T23-48-52.785657.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-27T23-48-52.785657.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_12T03_05_07.244584", "path": ["results_2023-09-12T03-05-07.244584.parquet"]}, {"split": "2023_10_27T23_48_52.785657", "path": ["results_2023-10-27T23-48-52.785657.parquet"]}, {"split": "latest", "path": ["results_2023-10-27T23-48-52.785657.parquet"]}]}]}
2023-10-27T22:49:04+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Mikivis/gpt2-large-lora-stf4 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Mikivis/gpt2-large-lora-stf4 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-27T23:48:52.785657 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
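The loading snippet referenced above is not reproduced in this record, so here is a minimal sketch of what it typically looks like. The config name `harness_winogrande_5` and the `latest` split are taken from the record's metadata; the repository id `open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4` is an assumption based on the leaderboard's usual naming convention and may differ.

```python
from datasets import load_dataset

# Assumed repo id (leaderboard naming convention); the config and split names
# come from the "data_files" metadata of this record.
data = load_dataset(
    "open-llm-leaderboard/details_Mikivis__gpt2-large-lora-stf4",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```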
[ "# Dataset Card for Evaluation run of Mikivis/gpt2-large-lora-stf4", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Mikivis/gpt2-large-lora-stf4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T23:48:52.785657(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Mikivis/gpt2-large-lora-stf4", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Mikivis/gpt2-large-lora-stf4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T23:48:52.785657(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Mikivis/gpt2-large-lora-stf4## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Mikivis/gpt2-large-lora-stf4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-27T23:48:52.785657(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c67de644e5cdeeecb3fdcb48be3037c5cf09fe53
This is a Chinese adult (H) novel dataset collected from sis001. sis-novel1 contains short and medium-length novels: 112,182 items, about 5.7 GB after decompression, with data up to July 2022. sis-novel2 contains full-length novels: 4,555 items, about 3.6 GB after decompression, with data up to March 2023. All data are uncleaned txt files and may contain comments.
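The card does not show how to fetch the archives; below is a minimal sketch, assuming only that the two txt collections are stored as compressed files in the dataset repository (exact filenames are not listed in the card). If the openrail-licensed repo requires an authenticated download, pass a token.

```python
from huggingface_hub import snapshot_download

# Download the whole dataset repository (both sis-novel1 and sis-novel2 archives).
# Pass token="hf_..." if the repo requires an authenticated / accepted-license download.
local_dir = snapshot_download(
    repo_id="a686d380/sis-novel",
    repo_type="dataset",
)
print("Archives downloaded to:", local_dir)
```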
a686d380/sis-novel
[ "license:openrail", "region:us" ]
2023-09-12T02:14:04+00:00
{"license": "openrail", "viewer": false}
2023-09-12T02:59:29+00:00
[]
[]
TAGS #license-openrail #region-us
This is a Chinese adult (H) novel dataset collected from sis001. sis-novel1 contains short and medium-length novels: 112,182 items, about 5.7 GB after decompression, with data up to July 2022. sis-novel2 contains full-length novels: 4,555 items, about 3.6 GB after decompression, with data up to March 2023. All data are uncleaned txt files and may contain comments.
[]
[ "TAGS\n#license-openrail #region-us \n" ]
[ 12 ]
[ "passage: TAGS\n#license-openrail #region-us \n" ]
ee76a808c221b312aaff98fa243c168e78a3e6be
# Dataset of karin (Pokémon) This is the dataset of karin (Pokémon), containing 57 images and their tags. The core tags of this character are `long_hair, breasts, blue_eyes, blue_hair, large_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 57 | 54.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/karin_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 57 | 35.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/karin_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 102 | 63.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/karin_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 57 | 50.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/karin_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 102 | 86.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/karin_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/karin_pokemon', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, crop_top, midriff, navel, white_pants, bare_arms, looking_at_viewer, smile, bare_shoulders, eyelashes, solo, cleavage, collarbone, grey_hair, standing, yellow_shirt, capri_pants, full_body, hand_on_hip | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | crop_top | midriff | navel | white_pants | bare_arms | looking_at_viewer | smile | bare_shoulders | eyelashes | solo | cleavage | collarbone | grey_hair | standing | yellow_shirt | capri_pants | full_body | hand_on_hip | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:----------|:--------|:--------------|:------------|:--------------------|:--------|:-----------------|:------------|:-------|:-----------|:-------------|:------------|:-----------|:---------------|:--------------|:------------|:--------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
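Beyond the waifuc loader shown in the card, the IMG+TXT packages can also be used directly. The sketch below downloads the `dataset-800.zip` package listed in the "List of Packages" table and pairs each image with its tag file; the assumption that tags live in a `.txt` file sharing the image's basename is an inference from the IMG+TXT description, not something the card states explicitly.

```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# Download the 800px IMG+TXT package listed in the "List of Packages" table.
zip_file = hf_hub_download(
    repo_id='CyberHarem/karin_pokemon',
    repo_type='dataset',
    filename='dataset-800.zip',
)

dataset_dir = 'karin_pokemon_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# Pair images with their tag files (assumed convention: same basename, .txt extension).
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in {'.png', '.jpg', '.jpeg', '.webp'}:
        tag_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(tag_path):
            with open(tag_path, 'r', encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```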
CyberHarem/karin_pokemon
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T02:16:13+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T19:35:36+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of karin (Pokémon) ========================== This is the dataset of karin (Pokémon), containing 57 images and their tags. The core tags of this character are 'long\_hair, breasts, blue\_eyes, blue\_hair, large\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
8aebd6d121d372e84f9067d0b43e3db2e41e3a6f
# Dataset Card for "c4_academic_2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zxvix/c4_academic_2
[ "region:us" ]
2023-09-12T02:22:50+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "timestamp", "dtype": "timestamp[s]"}, {"name": "url", "dtype": "string"}, {"name": "original_text", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 2911336.564, "num_examples": 986}], "download_size": 1841617, "dataset_size": 2911336.564}}
2023-09-12T03:10:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for "c4_academic_2" More Information needed
[ "# Dataset Card for \"c4_academic_2\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"c4_academic_2\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"c4_academic_2\"\n\nMore Information needed" ]
b64810cf5bdcc908cf9929a58fe062cafc147c71
# Dataset Card for "AR-dotless-mediumPlus" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dot-ammar/AR-dotless-mediumPlus
[ "region:us" ]
2023-09-12T02:23:04+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "clean", "dtype": "string"}, {"name": "dotless", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 782074235.6168703, "num_examples": 4446330}], "download_size": 446112756, "dataset_size": 782074235.6168703}}
2023-09-12T02:24:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for "AR-dotless-mediumPlus" More Information needed
[ "# Dataset Card for \"AR-dotless-mediumPlus\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"AR-dotless-mediumPlus\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"AR-dotless-mediumPlus\"\n\nMore Information needed" ]
4e9553323bd33f734bb80137fb04280e129eb0fd
# Dataset Card for "silverliningeda-dataset-test" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
silverliningeda/silverliningeda-dataset-test
[ "region:us" ]
2023-09-12T02:44:04+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 163420, "num_examples": 500}], "download_size": 3073, "dataset_size": 163420}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-14T22:54:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for "silverliningeda-dataset-test" More Information needed
[ "# Dataset Card for \"silverliningeda-dataset-test\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"silverliningeda-dataset-test\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"silverliningeda-dataset-test\"\n\nMore Information needed" ]
59b0adfadc04c49fb81656fba1ea1f474ead651d
# Dataset of hassan_of_the_serenity/静謐のハサン/静谧哈桑 (Fate/Grand Order) This is the dataset of hassan_of_the_serenity/静謐のハサン/静谧哈桑 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are `dark-skinned_female, dark_skin, purple_hair, purple_eyes, short_hair, breasts, bangs, hair_between_eyes, hairband, black_hairband, medium_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 577.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hassan_of_the_serenity_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 310.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hassan_of_the_serenity_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1172 | 667.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hassan_of_the_serenity_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 498.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hassan_of_the_serenity_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1172 | 978.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hassan_of_the_serenity_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/hassan_of_the_serenity_fgo', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, black_gloves, looking_at_viewer, solo, fingerless_gloves, holding, center_opening, black_leotard, knife, leggings, kunai, mask, navel, white_background | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, black_gloves, center_opening, fingerless_gloves, holding_mask, looking_at_viewer, mask_removed, navel, solo, flower, simple_background, skull_mask, white_background, arm_behind_back, black_leotard, blush, cowboy_shot, leggings, parted_lips, small_breasts | | 2 | 13 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, center_opening, looking_at_viewer, navel, simple_background, solo, white_background, bare_shoulders, black_gloves, black_leotard, fingerless_gloves, blush, collarbone | | 3 | 10 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, ass, bare_shoulders, fingerless_gloves, looking_at_viewer, backless_outfit, black_gloves, looking_back, solo, bare_back, from_behind, simple_background, white_background, black_leotard, leggings, holding_weapon | | 4 | 19 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, bare_shoulders, sleeveless_dress, solo, white_dress, collarbone, looking_at_viewer, small_breasts, blush, simple_background, white_background, closed_mouth, sidelocks, smile, upper_body, bare_arms | | 5 | 18 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, blush, solo, closed_mouth, looking_at_viewer, smile, white_background, simple_background, long_sleeves, sleeves_past_wrists, hand_up, sidelocks, black_shirt, upper_body, collarbone | | 6 | 8 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, black_hoodie, blush, hood_down, looking_at_viewer, solo, puffy_long_sleeves, simple_background, sleeves_past_wrists, smile, white_background, drawstring, closed_mouth, sidelocks, :>, upper_body, v-shaped_eyebrows, very_dark_skin | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, bare_shoulders, detached_sleeves, 
looking_at_viewer, solo, very_dark_skin, apple, hair_flower, white_dress, detached_collar, blush, holding_fruit, pink_dress | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, long_sleeves, looking_at_viewer, pleated_skirt, school_uniform, solo, white_background, blush, collared_shirt, plaid_skirt, smile, white_shirt, alternate_costume, blazer, closed_mouth, open_clothes, simple_background, black_jacket, black_skirt, bowtie, sidelocks, sleeves_past_wrists, sweater | | 9 | 10 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, bare_shoulders, looking_at_viewer, solo, blush, bell, christmas, sheep_horns, smile, white_thighhighs, underboob, gift, navel, ribbon-trimmed_legwear, bare_arms, bow, white_panties | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_gloves | looking_at_viewer | solo | fingerless_gloves | holding | center_opening | black_leotard | knife | leggings | kunai | mask | navel | white_background | holding_mask | mask_removed | flower | simple_background | skull_mask | arm_behind_back | blush | cowboy_shot | parted_lips | small_breasts | collarbone | ass | backless_outfit | looking_back | bare_back | from_behind | holding_weapon | sleeveless_dress | white_dress | closed_mouth | sidelocks | smile | upper_body | bare_arms | long_sleeves | sleeves_past_wrists | hand_up | black_shirt | black_hoodie | hood_down | puffy_long_sleeves | drawstring | :> | v-shaped_eyebrows | very_dark_skin | detached_sleeves | apple | hair_flower | detached_collar | holding_fruit | pink_dress | pleated_skirt | school_uniform | collared_shirt | plaid_skirt | white_shirt | alternate_costume | blazer | open_clothes | black_jacket | black_skirt | bowtie | sweater | bell | christmas | sheep_horns | white_thighhighs | underboob | gift | ribbon-trimmed_legwear | bow | white_panties | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:--------------------|:-------|:--------------------|:----------|:-----------------|:----------------|:--------|:-----------|:--------|:-------|:--------|:-------------------|:---------------|:---------------|:---------|:--------------------|:-------------|:------------------|:--------|:--------------|:--------------|:----------------|:-------------|:------|:------------------|:---------------|:------------|:--------------|:-----------------|:-------------------|:--------------|:---------------|:------------|:--------|:-------------|:------------|:---------------|:----------------------|:----------|:--------------|:---------------|:------------|:---------------------|:-------------|:-----|:--------------------|:-----------------|:-------------------|:--------|:--------------|:------------------|:----------------|:-------------|:----------------|:-----------------|:-----------------|:--------------|:--------------|:--------------------|:---------|:---------------|:---------------|:--------------|:---------|:----------|:-------|:------------|:--------------|:-------------------|:------------|:-------|:-------------------------|:------|:----------------| | 0 | 13 | 
![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 13 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | | X | X | | | | | X | X | | | | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 10 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | X | X | | | X | | X | | | | X | | | | X | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 19 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | X | X | | | | | | | | | | X | | | | X | | | X | | | X | X | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 18 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | X | X | | | | | | | | | | X | | | | X | | | X | | | | X | | | | | | | | | X | X | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 8 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | X | X | | | | | | | | | | X | | | | X | | | X | | | | | | | | | | | | | X | X | X | X | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | X | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | | X | X | | | | | | | | | | X | | | | X | | | X | | | | | | | | | | | | | X | X | X | | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | 9 | 10 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | | X | X | | | | | | | | | X | | | | | | | | X | | | | | | | | | | 
| | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
CyberHarem/hassan_of_the_serenity_fgo
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T03:00:13+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-12T03:06:22+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of hassan\_of\_the\_serenity/静謐のハサン/静谧哈桑 (Fate/Grand Order) =================================================================== This is the dataset of hassan\_of\_the\_serenity/静謐のハサン/静谧哈桑 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are 'dark-skinned\_female, dark\_skin, purple\_hair, purple\_eyes, short\_hair, breasts, bangs, hair\_between\_eyes, hairband, black\_hairband, medium\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
02e7523044ab7157fa104b118bc8463783f8d231
# Dataset of anastasia (THE iDOLM@STER: Cinderella Girls) This is the dataset of anastasia (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are `short_hair, blue_eyes, grey_hair, breasts, hair_between_eyes, bangs, medium_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 671.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anastasia_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 398.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anastasia_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1222 | 852.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anastasia_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 595.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anastasia_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1222 | 1.14 GiB | [Download](https://huggingface.co/datasets/CyberHarem/anastasia_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/anastasia_idolmastercinderellagirls', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, hair_ornament, looking_at_viewer, solo, starry_sky, thighhighs, white_gloves, night_sky, open_mouth, :d, blush, dress, earrings, zettai_ryouiki | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, blue_dress, hair_flower, looking_at_viewer, solo, white_gloves, bare_shoulders, sleeveless_dress, starry_sky, aurora, choker, cleavage, earrings, night_sky, smile, sparkle, armlet, cowboy_shot, high_heels, standing, strapless, white_flower | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, dress, solo, bare_shoulders, elbow_gloves, hat, looking_at_viewer, pantyhose, smile, necklace, simple_background, white_background | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, hat, looking_at_viewer, solo, cape, flower, holding_weapon, knee_boots, open_mouth, smile, necklace, thigh_strap, belt, detached_sleeves, elbow_gloves, full_body, rapier, sheath, short_dress, star_(symbol) | | 4 | 15 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, solo, looking_at_viewer, blush, smile, earrings, open_mouth | | 5 | 10 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, fur_trim, looking_at_viewer, necklace, solo, coat, parka, smile, black_shorts | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, looking_at_viewer, open_mouth, simple_background, solo, white_background, necklace, upper_body, :d, cleavage, collarbone, shiny_hair, sleeveless_dress | | 7 | 7 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, closed_mouth, collarbone, looking_at_viewer, smile, solo, blush, simple_background, upper_body, white_background, shirt, cleavage, sleeveless | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, looking_at_viewer, solo, cross_necklace, fur_trim, blush, coat, earrings, ribbed_sweater, jacket, 
upper_body, long_sleeves, simple_background, turtleneck_sweater | | 9 | 9 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, long_sleeves, solo, white_gloves, looking_at_viewer, smile, standing, white_skirt, zettai_ryouiki, blue_jacket, white_background, feather_hair_ornament, holding_microphone, shiny_hair, simple_background, blue_feathers, blue_necktie, blue_thighhighs, blush, closed_mouth, dress, short_necktie, white_feathers | | 10 | 9 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, bare_shoulders, blush, collarbone, day, looking_at_viewer, outdoors, solo, beach, blue_bikini, cloud, halterneck, ocean, skindentation, smile, string_bikini, wet, blue_sky, cleavage, large_breasts, water, earrings, upper_body, navel, open_mouth, bare_arms, stomach, sunlight | | 11 | 10 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 1girl, kimono, smile, solo, looking_at_viewer, open_mouth, floral_print, obi, hair_flower, wide_sleeves | | 12 | 10 | ![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | ![](samples/12/clu12-sample2.png) | ![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | 1girl, blush, nipples, solo, looking_at_viewer, navel, spread_legs, arms_behind_back, barefoot, bondage, bound_arms, completely_nude, rope, shibari, sweat, vibrator, collarbone, mosaic_censoring, open_mouth, pussy_juice, restrained, saliva, dildo, gagged, large_breasts, vaginal_object_insertion | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hair_ornament | looking_at_viewer | solo | starry_sky | thighhighs | white_gloves | night_sky | open_mouth | :d | blush | dress | earrings | zettai_ryouiki | blue_dress | hair_flower | bare_shoulders | sleeveless_dress | aurora | choker | cleavage | smile | sparkle | armlet | cowboy_shot | high_heels | standing | strapless | white_flower | elbow_gloves | hat | pantyhose | necklace | simple_background | white_background | cape | flower | holding_weapon | knee_boots | thigh_strap | belt | detached_sleeves | full_body | rapier | sheath | short_dress | star_(symbol) | fur_trim | coat | parka | black_shorts | upper_body | collarbone | shiny_hair | closed_mouth | shirt | sleeveless | cross_necklace | ribbed_sweater | jacket | long_sleeves | turtleneck_sweater | white_skirt | blue_jacket | feather_hair_ornament | holding_microphone | blue_feathers | blue_necktie | blue_thighhighs | short_necktie | white_feathers | day | outdoors | beach | blue_bikini | cloud | halterneck | ocean | skindentation | string_bikini | wet | blue_sky | large_breasts | water | navel | bare_arms | stomach | sunlight | kimono | floral_print | obi | wide_sleeves | nipples | spread_legs | arms_behind_back | barefoot | bondage | bound_arms | completely_nude | rope | shibari | sweat | vibrator | mosaic_censoring | pussy_juice | restrained | saliva | dildo | gagged | vaginal_object_insertion | 
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:----------------|:--------------------|:-------|:-------------|:-------------|:---------------|:------------|:-------------|:-----|:--------|:--------|:-----------|:-----------------|:-------------|:--------------|:-----------------|:-------------------|:---------|:---------|:-----------|:--------|:----------|:---------|:--------------|:-------------|:-----------|:------------|:---------------|:---------------|:------|:------------|:-----------|:--------------------|:-------------------|:-------|:---------|:-----------------|:-------------|:--------------|:-------|:-------------------|:------------|:---------|:---------|:--------------|:----------------|:-----------|:-------|:--------|:---------------|:-------------|:-------------|:-------------|:---------------|:--------|:-------------|:-----------------|:-----------------|:---------|:---------------|:---------------------|:--------------|:--------------|:------------------------|:---------------------|:----------------|:---------------|:------------------|:----------------|:-----------------|:------|:-----------|:--------|:--------------|:--------|:-------------|:--------|:----------------|:----------------|:------|:-----------|:----------------|:--------|:--------|:------------|:----------|:-----------|:---------|:---------------|:------|:---------------|:----------|:--------------|:-------------------|:-----------|:----------|:-------------|:------------------|:-------|:----------|:--------|:-----------|:-------------------|:--------------|:-------------|:---------|:--------|:---------|:---------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | X | X | | X | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | X | | | | | | | | X | | | | | X | | | | | X | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | | | | | X | | | | | | | | | | | | | X | | | | | | | | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 15 | 
![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | X | | | | | X | | X | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 10 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | X | | | | | X | X | | | | | | | | X | | | X | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 7 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | X | X | | | | | | | X | | | | | | | | | | X | X | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | X | X | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | | | X | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 9 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | X | X | | | X | | | | X | X | | X | | | | | | | | X | | | | | X | | | | | | | X | X | | | | | | | | | | | | | | | | | | | X | X | | | | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 10 | 9 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | X | X | | | | | X | | X | | X | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | 11 | 10 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | | X | X | | | | | X | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | 12 | 10 | ![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | 
![](samples/12/clu12-sample2.png) | ![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | X | | X | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
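The waifuc snippet above covers the raw package; if you only want one of the packed IMG+TXT packages from the table (for example `dataset-800.zip`), a minimal sketch along these lines may be enough. The pairing of each image with a same-named `.txt` tag file, and the image extensions inside the archive, are assumptions about the IMG+TXT layout rather than something stated in this card, so adjust the glob to whatever you actually find after extraction.

```python
import os
import zipfile
from glob import glob

from huggingface_hub import hf_hub_download

# download one of the packed IMG+TXT packages listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/anastasia_idolmastercinderellagirls',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract to a local directory
dataset_dir = 'dataset-800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# assumed layout: each image ships with a same-named .txt file of tags
for image_path in sorted(glob(os.path.join(dataset_dir, '**', '*.png'), recursive=True)):
    txt_path = os.path.splitext(image_path)[0] + '.txt'
    if not os.path.exists(txt_path):
        continue
    with open(txt_path, 'r', encoding='utf-8') as f:
        tags = f.read().strip()
    print(image_path, tags)
```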
CyberHarem/anastasia_idolmastercinderellagirls
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T03:07:27+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T23:54:31+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of anastasia (THE iDOLM@STER: Cinderella Girls) ======================================================= This is the dataset of anastasia (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are 'short\_hair, blue\_eyes, grey\_hair, breasts, hair\_between\_eyes, bangs, medium\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
50b7433d2cabb22a177a31fc7233f00fee8c5338
# Dataset Card for "sv_corpora_parliament_processed" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
macarious/sv_corpora_parliament_processed
[ "region:us" ]
2023-09-12T03:11:36+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 292351437, "num_examples": 1892723}], "download_size": 0, "dataset_size": 292351437}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
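Given the config above (a single `text` column and one `train` split with 1,892,723 examples), a minimal loading sketch with the `datasets` library might look like this; `streaming=True` is only shown as an optional way to avoid materialising the full ~292 MB corpus at once.

```python
from datasets import load_dataset

# load the single train split described in the config above
ds = load_dataset("macarious/sv_corpora_parliament_processed", split="train")
print(ds)             # should report one 'text' feature and 1,892,723 rows
print(ds[0]["text"])  # first row of the processed corpus

# alternatively, stream the corpus instead of downloading it all at once
streamed = load_dataset(
    "macarious/sv_corpora_parliament_processed", split="train", streaming=True
)
print(next(iter(streamed))["text"])
```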
2023-09-15T17:12:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for "sv_corpora_parliament_processed" More Information needed
[ "# Dataset Card for \"sv_corpora_parliament_processed\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"sv_corpora_parliament_processed\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"sv_corpora_parliament_processed\"\n\nMore Information needed" ]
dc7a5847e0794b71301aa2525a57a8104d5e3ce6
# Dataset Card for Evaluation run of Riiid/sheep-duck-llama-2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Riiid/sheep-duck-llama-2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Riiid/sheep-duck-llama-2](https://huggingface.co/Riiid/sheep-duck-llama-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Riiid__sheep-duck-llama-2", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-09-19T02:41:38.567550](https://huggingface.co/datasets/open-llm-leaderboard/details_Riiid__sheep-duck-llama-2/blob/main/results_2023-09-19T02-41-38.567550.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7074787526637408, "acc_stderr": 0.030842770794867788, "acc_norm": 0.7112713043078007, "acc_norm_stderr": 0.03081173438001915, "mc1": 0.4663402692778458, "mc1_stderr": 0.017463793867168103, "mc2": 0.6379733867215786, "mc2_stderr": 0.014804542452694204 }, "harness|arc:challenge|25": { "acc": 0.6851535836177475, "acc_stderr": 0.013572657703084948, "acc_norm": 0.7226962457337884, "acc_norm_stderr": 0.013082095839059376 }, "harness|hellaswag|10": { "acc": 0.6915952997410875, "acc_stderr": 0.0046089078729577085, "acc_norm": 0.8778131846245768, "acc_norm_stderr": 0.003268321260913631 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.04171654161354543, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.04171654161354543 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8026315789473685, "acc_stderr": 0.03238981601699397, "acc_norm": 0.8026315789473685, "acc_norm_stderr": 0.03238981601699397 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.74, "acc_stderr": 0.04408440022768081, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768081 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7471698113207547, "acc_stderr": 0.02674989977124121, "acc_norm": 0.7471698113207547, "acc_norm_stderr": 0.02674989977124121 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8194444444444444, "acc_stderr": 0.03216600808802267, "acc_norm": 0.8194444444444444, "acc_norm_stderr": 0.03216600808802267 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816507, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816507 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6851063829787234, "acc_stderr": 0.03036358219723817, "acc_norm": 0.6851063829787234, "acc_norm_stderr": 0.03036358219723817 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.04692008381368909, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.04692008381368909 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6413793103448275, "acc_stderr": 0.03996629574876719, "acc_norm": 0.6413793103448275, "acc_norm_stderr": 0.03996629574876719 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4708994708994709, "acc_stderr": 0.02570765861415495, "acc_norm": 0.4708994708994709, "acc_norm_stderr": 0.02570765861415495 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5238095238095238, "acc_stderr": 0.04467062628403273, "acc_norm": 0.5238095238095238, "acc_norm_stderr": 0.04467062628403273 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8225806451612904, "acc_stderr": 0.021732540689329286, "acc_norm": 0.8225806451612904, "acc_norm_stderr": 0.021732540689329286 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5665024630541872, "acc_stderr": 0.034867317274198714, "acc_norm": 0.5665024630541872, "acc_norm_stderr": 0.034867317274198714 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.78, "acc_stderr": 0.04163331998932262, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932262 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8484848484848485, "acc_stderr": 0.027998073798781678, "acc_norm": 0.8484848484848485, "acc_norm_stderr": 0.027998073798781678 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8888888888888888, "acc_stderr": 0.022390787638216763, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.022390787638216763 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9378238341968912, "acc_stderr": 0.017426974154240528, "acc_norm": 0.9378238341968912, "acc_norm_stderr": 0.017426974154240528 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7102564102564103, "acc_stderr": 0.023000628243687968, "acc_norm": 0.7102564102564103, "acc_norm_stderr": 0.023000628243687968 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.0284934650910286, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.0284934650910286 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7773109243697479, "acc_stderr": 0.027025433498882385, "acc_norm": 
0.7773109243697479, "acc_norm_stderr": 0.027025433498882385 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.47019867549668876, "acc_stderr": 0.040752249922169775, "acc_norm": 0.47019867549668876, "acc_norm_stderr": 0.040752249922169775 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.9009174311926605, "acc_stderr": 0.01280978008187893, "acc_norm": 0.9009174311926605, "acc_norm_stderr": 0.01280978008187893 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5972222222222222, "acc_stderr": 0.03344887382997865, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.03344887382997865 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.9313725490196079, "acc_stderr": 0.017744453647073315, "acc_norm": 0.9313725490196079, "acc_norm_stderr": 0.017744453647073315 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.9029535864978903, "acc_stderr": 0.019269323025640255, "acc_norm": 0.9029535864978903, "acc_norm_stderr": 0.019269323025640255 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7982062780269058, "acc_stderr": 0.026936111912802273, "acc_norm": 0.7982062780269058, "acc_norm_stderr": 0.026936111912802273 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8549618320610687, "acc_stderr": 0.030884661089515368, "acc_norm": 0.8549618320610687, "acc_norm_stderr": 0.030884661089515368 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8760330578512396, "acc_stderr": 0.030083098716035206, "acc_norm": 0.8760330578512396, "acc_norm_stderr": 0.030083098716035206 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8333333333333334, "acc_stderr": 0.03602814176392645, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.03602814176392645 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8282208588957055, "acc_stderr": 0.029634717272371037, "acc_norm": 0.8282208588957055, "acc_norm_stderr": 0.029634717272371037 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5446428571428571, "acc_stderr": 0.04726835553719098, "acc_norm": 0.5446428571428571, "acc_norm_stderr": 0.04726835553719098 }, "harness|hendrycksTest-management|5": { "acc": 0.8446601941747572, "acc_stderr": 0.03586594738573974, "acc_norm": 0.8446601941747572, "acc_norm_stderr": 0.03586594738573974 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9145299145299145, "acc_stderr": 0.01831589168562585, "acc_norm": 0.9145299145299145, "acc_norm_stderr": 0.01831589168562585 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.04560480215720684, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8697318007662835, "acc_stderr": 0.012036729568216055, "acc_norm": 0.8697318007662835, "acc_norm_stderr": 0.012036729568216055 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7832369942196532, "acc_stderr": 0.022183477668412856, "acc_norm": 0.7832369942196532, "acc_norm_stderr": 0.022183477668412856 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.6245810055865921, "acc_stderr": 0.01619510424846353, "acc_norm": 0.6245810055865921, "acc_norm_stderr": 0.01619510424846353 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7549019607843137, "acc_stderr": 0.02463004897982477, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.02463004897982477 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7909967845659164, "acc_stderr": 0.02309314039837422, "acc_norm": 0.7909967845659164, "acc_norm_stderr": 0.02309314039837422 }, "harness|hendrycksTest-prehistory|5": { "acc": 
0.8425925925925926, "acc_stderr": 0.020263764996385717, "acc_norm": 0.8425925925925926, "acc_norm_stderr": 0.020263764996385717 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5780141843971631, "acc_stderr": 0.029462189233370593, "acc_norm": 0.5780141843971631, "acc_norm_stderr": 0.029462189233370593 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5867014341590613, "acc_stderr": 0.012576779494860076, "acc_norm": 0.5867014341590613, "acc_norm_stderr": 0.012576779494860076 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7389705882352942, "acc_stderr": 0.02667925227010314, "acc_norm": 0.7389705882352942, "acc_norm_stderr": 0.02667925227010314 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7696078431372549, "acc_stderr": 0.01703522925803403, "acc_norm": 0.7696078431372549, "acc_norm_stderr": 0.01703522925803403 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7545454545454545, "acc_stderr": 0.041220665028782855, "acc_norm": 0.7545454545454545, "acc_norm_stderr": 0.041220665028782855 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7959183673469388, "acc_stderr": 0.025801283475090496, "acc_norm": 0.7959183673469388, "acc_norm_stderr": 0.025801283475090496 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8706467661691543, "acc_stderr": 0.023729830881018533, "acc_norm": 0.8706467661691543, "acc_norm_stderr": 0.023729830881018533 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8771929824561403, "acc_stderr": 0.02517298435015575, "acc_norm": 0.8771929824561403, "acc_norm_stderr": 0.02517298435015575 }, "harness|truthfulqa:mc|0": { "mc1": 0.4663402692778458, "mc1_stderr": 0.017463793867168103, "mc2": 0.6379733867215786, "mc2_stderr": 0.014804542452694204 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
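As a complement to the loading snippet in the card above, here is a hedged sketch for working with the per-task detail files: the `harness_arc_challenge_25` config name and its `latest` split are taken from the data-file layout recorded below, while `get_dataset_config_names` is simply a convenient way to enumerate the 61 task configurations mentioned in the summary.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Riiid__sheep-duck-llama-2"

# enumerate the per-task configurations (61 according to the summary above)
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# each config defines a "latest" split pointing at the newest evaluation run
arc_details = load_dataset(repo, "harness_arc_challenge_25", split="latest")
print(arc_details)
print(arc_details[0])  # inspect one evaluated example from the latest run
```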
open-llm-leaderboard/details_Riiid__sheep-duck-llama-2
[ "region:us" ]
2023-09-12T03:15:36+00:00
{"pretty_name": "Evaluation run of Riiid/sheep-duck-llama-2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Riiid/sheep-duck-llama-2](https://huggingface.co/Riiid/sheep-duck-llama-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Riiid__sheep-duck-llama-2\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-19T02:41:38.567550](https://huggingface.co/datasets/open-llm-leaderboard/details_Riiid__sheep-duck-llama-2/blob/main/results_2023-09-19T02-41-38.567550.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7074787526637408,\n \"acc_stderr\": 0.030842770794867788,\n \"acc_norm\": 0.7112713043078007,\n \"acc_norm_stderr\": 0.03081173438001915,\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6379733867215786,\n \"mc2_stderr\": 0.014804542452694204\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6851535836177475,\n \"acc_stderr\": 0.013572657703084948,\n \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059376\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6915952997410875,\n \"acc_stderr\": 0.0046089078729577085,\n \"acc_norm\": 0.8778131846245768,\n \"acc_norm_stderr\": 0.003268321260913631\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7471698113207547,\n \"acc_stderr\": 0.02674989977124121,\n \"acc_norm\": 0.7471698113207547,\n \"acc_norm_stderr\": 0.02674989977124121\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n \"acc_stderr\": 0.03216600808802267,\n \"acc_norm\": 0.8194444444444444,\n \"acc_norm_stderr\": 0.03216600808802267\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n 
\"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6851063829787234,\n \"acc_stderr\": 0.03036358219723817,\n \"acc_norm\": 0.6851063829787234,\n \"acc_norm_stderr\": 0.03036358219723817\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.03996629574876719,\n \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.03996629574876719\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4708994708994709,\n \"acc_stderr\": 0.02570765861415495,\n \"acc_norm\": 0.4708994708994709,\n \"acc_norm_stderr\": 0.02570765861415495\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n \"acc_stderr\": 0.021732540689329286,\n \"acc_norm\": 0.8225806451612904,\n \"acc_norm_stderr\": 0.021732540689329286\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5665024630541872,\n \"acc_stderr\": 0.034867317274198714,\n \"acc_norm\": 0.5665024630541872,\n \"acc_norm_stderr\": 0.034867317274198714\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781678,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781678\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.022390787638216763,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.022390787638216763\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7102564102564103,\n 
\"acc_stderr\": 0.023000628243687968,\n \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687968\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.0284934650910286,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.0284934650910286\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7773109243697479,\n \"acc_stderr\": 0.027025433498882385,\n \"acc_norm\": 0.7773109243697479,\n \"acc_norm_stderr\": 0.027025433498882385\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9009174311926605,\n \"acc_stderr\": 0.01280978008187893,\n \"acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.01280978008187893\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997865,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997865\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073315,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073315\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640255,\n \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640255\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.030884661089515368,\n \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.030884661089515368\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035206,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035206\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.029634717272371037,\n \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.029634717272371037\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8697318007662835,\n \"acc_stderr\": 0.012036729568216055,\n \"acc_norm\": 
0.8697318007662835,\n \"acc_norm_stderr\": 0.012036729568216055\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6245810055865921,\n \"acc_stderr\": 0.01619510424846353,\n \"acc_norm\": 0.6245810055865921,\n \"acc_norm_stderr\": 0.01619510424846353\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982477,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982477\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7909967845659164,\n \"acc_stderr\": 0.02309314039837422,\n \"acc_norm\": 0.7909967845659164,\n \"acc_norm_stderr\": 0.02309314039837422\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.020263764996385717,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.020263764996385717\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5780141843971631,\n \"acc_stderr\": 0.029462189233370593,\n \"acc_norm\": 0.5780141843971631,\n \"acc_norm_stderr\": 0.029462189233370593\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5867014341590613,\n \"acc_stderr\": 0.012576779494860076,\n \"acc_norm\": 0.5867014341590613,\n \"acc_norm_stderr\": 0.012576779494860076\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.02667925227010314,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.02667925227010314\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.01703522925803403,\n \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.01703522925803403\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7959183673469388,\n \"acc_stderr\": 0.025801283475090496,\n \"acc_norm\": 0.7959183673469388,\n \"acc_norm_stderr\": 0.025801283475090496\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.023729830881018533,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.023729830881018533\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015575,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015575\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6379733867215786,\n \"mc2_stderr\": 0.014804542452694204\n }\n}\n```", "repo_url": "https://huggingface.co/Riiid/sheep-duck-llama-2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": 
[{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|arc:challenge|25_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|arc:challenge|25_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hellaswag|10_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hellaswag|10_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-15-20.917267.parquet", 
"**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T04-15-20.917267.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T02-41-38.567550.parquet", 
"**/details_harness|hendrycksTest-college_physics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T02-41-38.567550.parquet", 
"**/details_harness|hendrycksTest-professional_law|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T02-41-38.567550.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-19T02-41-38.567550.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": 
[{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-philosophy|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", 
"data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-19T02-41-38.567550.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_12T04_15_20.917267", "path": ["results_2023-09-12T04-15-20.917267.parquet"]}, {"split": "2023_09_19T02_41_38.567550", "path": ["results_2023-09-19T02-41-38.567550.parquet"]}, {"split": "latest", "path": ["results_2023-09-19T02-41-38.567550.parquet"]}]}]}
2023-09-19T01:43:00+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Riiid/sheep-duck-llama-2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Riiid/sheep-duck-llama-2 on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-19T02:41:38.567550 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
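A minimal sketch of the loading call referenced in the summary above, assuming the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming and using one of the configurations and the "latest" split listed in the accompanying metadata (both names are assumptions to adapt as needed):

```python
from datasets import load_dataset

# Load one evaluated task for Riiid/sheep-duck-llama-2. The repository id and
# config name below follow the standard Open LLM Leaderboard naming scheme and
# the config list in the metadata, not values stated explicitly in the card text.
data = load_dataset(
    "open-llm-leaderboard/details_Riiid__sheep-duck-llama-2",
    "harness_truthfulqa_mc_0",
    split="latest",
)
print(data[0])
```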
[ "# Dataset Card for Evaluation run of Riiid/sheep-duck-llama-2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Riiid/sheep-duck-llama-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-19T02:41:38.567550(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Riiid/sheep-duck-llama-2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Riiid/sheep-duck-llama-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-19T02:41:38.567550(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 21, 31, 169, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Riiid/sheep-duck-llama-2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Riiid/sheep-duck-llama-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-19T02:41:38.567550(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
22f5f1a5cc9c2dd77bc8bcb5d96e60c60ab71830
# Few-shot Translation ## Install To install `lm-eval` from the GitHub repository main branch, run: ```bash git clone https://github.com/hieunguyen1053/lm-evaluation-harness cd lm-evaluation-harness pip install -e . ``` ## Basic Usage > **Note**: When reporting results from eval harness, please include the task versions (shown in `results["versions"]`) for reproducibility. This allows bug fixes to tasks while also ensuring that previously reported scores are reproducible. See the [Task Versioning](#task-versioning) section for more info. ### Hugging Face `transformers` To evaluate a model hosted on the [HuggingFace Hub](https://huggingface.co/models) (e.g. vlsp-2023-vllm/hoa-1b4) on `translation_vi`, you can use the following command: ```bash python main.py \ --model hf-causal \ --model_args pretrained=vlsp-2023-vllm/hoa-1b4 \ --tasks translation_vi \ --batch_size auto \ --device cuda:0 ``` Additional arguments can be provided to the model constructor using the `--model_args` flag. Most notably, this supports the common practice of using the `revisions` feature on the Hub to store partially trained checkpoints, or to specify the datatype for running a model: ```bash python main.py \ --model hf-causal \ --model_args pretrained=vlsp-2023-vllm/hoa-1b4,revision=step100000,dtype="float" \ --tasks translation_vi \ --device cuda:0 ``` To evaluate models that are loaded via `AutoSeq2SeqLM` in Huggingface, you instead use `hf-seq2seq`. *To evaluate (causal) models across multiple GPUs, use `--model hf-causal-experimental`* > **Warning**: Choosing the wrong model may result in erroneous outputs despite not erroring.
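For a quick look at the underlying data outside the harness, the records can also be inspected directly with the `datasets` library; this is a minimal sketch that assumes the `val`/`test` splits and the `en`, `vi`, and `fewshot_samples` fields declared in this repository's metadata:

```python
from datasets import load_dataset

# Split and field names are taken from the repository metadata
# (val/test splits; en, vi, fewshot_samples columns); adjust if they change.
data = load_dataset("vlsp-2023-vllm/en-to-vi-formal-informal-tranlations", split="val")

sample = data[0]
print(sample["en"])                    # English source sentence
print(sample["vi"])                    # Vietnamese reference translation
print(len(sample["fewshot_samples"]))  # number of attached few-shot example pairs
```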
vlsp-2023-vllm/en-to-vi-formal-informal-tranlations
[ "region:us" ]
2023-09-12T03:17:33+00:00
{"dataset_info": {"features": [{"name": "en", "dtype": "string"}, {"name": "vi", "dtype": "string"}, {"name": "fewshot_samples", "list": [{"name": "en", "dtype": "string"}, {"name": "vi", "dtype": "string"}]}], "splits": [{"name": "val", "num_bytes": 178154, "num_examples": 160}, {"name": "test", "num_bytes": 175339, "num_examples": 160}], "download_size": 124988, "dataset_size": 353493}}
2023-10-30T02:59:09+00:00
[]
[]
TAGS #region-us
# Few-shot Translation ## Install To install 'lm-eval' from the github repository main branch, run: ## Basic Usage > Note: When reporting results from eval harness, please include the task versions (shown in 'results["versions"]') for reproducibility. This allows bug fixes to tasks while also ensuring that previously reported scores are reproducible. See the Task Versioning section for more info. ### Hugging Face 'transformers' To evaluate a model hosted on the HuggingFace Hub (e.g. vlsp-2023-vllm/hoa-1b4) on 'hellaswag_vi' you can use the following command: Additional arguments can be provided to the model constructor using the '--model_args' flag. Most notably, this supports the common practice of using the 'revisions' feature on the Hub to store partially trained checkpoints, or to specify the datatype for running a model: To evaluate models that are loaded via 'AutoSeq2SeqLM' in Huggingface, you instead use 'hf-seq2seq'. *To evaluate (causal) models across multiple GPUs, use '--model hf-causal-experimental'* > Warning: Choosing the wrong model may result in erroneous outputs despite not erroring.
[ "# Few-shot Translation", "## Install\n\nTo install 'lm-eval' from the github repository main branch, run:", "## Basic Usage\n\n> Note: When reporting results from eval harness, please include the task versions (shown in 'results[\"versions\"]') for reproducibility. This allows bug fixes to tasks while also ensuring that previously reported scores are reproducible. See the Task Versioning section for more info.", "### Hugging Face 'transformers'\n\nTo evaluate a model hosted on the HuggingFace Hub (e.g. vlsp-2023-vllm/hoa-1b4) on 'hellaswag_vi' you can use the following command:\n\n\n\n\nAdditional arguments can be provided to the model constructor using the '--model_args' flag. Most notably, this supports the common practice of using the 'revisions' feature on the Hub to store partially trained checkpoints, or to specify the datatype for running a model:\n\n\n\nTo evaluate models that are loaded via 'AutoSeq2SeqLM' in Huggingface, you instead use 'hf-seq2seq'. *To evaluate (causal) models across multiple GPUs, use '--model hf-causal-experimental'*\n\n> Warning: Choosing the wrong model may result in erroneous outputs despite not erroring." ]
[ "TAGS\n#region-us \n", "# Few-shot Translation", "## Install\n\nTo install 'lm-eval' from the github repository main branch, run:", "## Basic Usage\n\n> Note: When reporting results from eval harness, please include the task versions (shown in 'results[\"versions\"]') for reproducibility. This allows bug fixes to tasks while also ensuring that previously reported scores are reproducible. See the Task Versioning section for more info.", "### Hugging Face 'transformers'\n\nTo evaluate a model hosted on the HuggingFace Hub (e.g. vlsp-2023-vllm/hoa-1b4) on 'hellaswag_vi' you can use the following command:\n\n\n\n\nAdditional arguments can be provided to the model constructor using the '--model_args' flag. Most notably, this supports the common practice of using the 'revisions' feature on the Hub to store partially trained checkpoints, or to specify the datatype for running a model:\n\n\n\nTo evaluate models that are loaded via 'AutoSeq2SeqLM' in Huggingface, you instead use 'hf-seq2seq'. *To evaluate (causal) models across multiple GPUs, use '--model hf-causal-experimental'*\n\n> Warning: Choosing the wrong model may result in erroneous outputs despite not erroring." ]
[ 6, 6, 23, 77, 219 ]
[ "passage: TAGS\n#region-us \n# Few-shot Translation## Install\n\nTo install 'lm-eval' from the github repository main branch, run:## Basic Usage\n\n> Note: When reporting results from eval harness, please include the task versions (shown in 'results[\"versions\"]') for reproducibility. This allows bug fixes to tasks while also ensuring that previously reported scores are reproducible. See the Task Versioning section for more info.### Hugging Face 'transformers'\n\nTo evaluate a model hosted on the HuggingFace Hub (e.g. vlsp-2023-vllm/hoa-1b4) on 'hellaswag_vi' you can use the following command:\n\n\n\n\nAdditional arguments can be provided to the model constructor using the '--model_args' flag. Most notably, this supports the common practice of using the 'revisions' feature on the Hub to store partially trained checkpoints, or to specify the datatype for running a model:\n\n\n\nTo evaluate models that are loaded via 'AutoSeq2SeqLM' in Huggingface, you instead use 'hf-seq2seq'. *To evaluate (causal) models across multiple GPUs, use '--model hf-causal-experimental'*\n\n> Warning: Choosing the wrong model may result in erroneous outputs despite not erroring." ]
01e4fe9d71389661d5aa529d6e06e486bd3998df
# Dataset Card for Dataset Name ## Dataset Description - **Homepage:** - **Repository:** - **Paper:** - **Leaderboard:** - **Point of Contact:** ### Dataset Summary This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
avisheknayak/testad1
[ "task_categories:summarization", "size_categories:n<1K", "language:en", "region:us" ]
2023-09-12T03:19:09+00:00
{"language": ["en"], "size_categories": ["n<1K"], "task_categories": ["summarization"]}
2023-09-12T06:04:54+00:00
[]
[ "en" ]
TAGS #task_categories-summarization #size_categories-n<1K #language-English #region-us
# Dataset Card for Dataset Name ## Dataset Description - Homepage: - Repository: - Paper: - Leaderboard: - Point of Contact: ### Dataset Summary This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Dataset Name", "## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:", "### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#task_categories-summarization #size_categories-n<1K #language-English #region-us \n", "# Dataset Card for Dataset Name", "## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:", "### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 30, 8, 24, 32, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#task_categories-summarization #size_categories-n<1K #language-English #region-us \n# Dataset Card for Dataset Name## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f361bf83375f7286c6a112d86e76953de56f043e
# Dataset Card for "xlsum_clm" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
khalidalt/xlsum_clm
[ "region:us" ]
2023-09-12T03:35:17+00:00
{"dataset_info": {"features": [{"name": "gem_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "target", "dtype": "string"}, {"name": "references", "list": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 217986489, "num_examples": 37519}], "download_size": 107517494, "dataset_size": 217986489}}
2023-09-12T03:36:27+00:00
[]
[]
TAGS #region-us
# Dataset Card for "xlsum_clm" More Information needed
[ "# Dataset Card for \"xlsum_clm\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"xlsum_clm\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"xlsum_clm\"\n\nMore Information needed" ]
6de595ba707a5659c7a8df4e81f7250f6fd6061c
# Dataset of acerola/アセロラ (Pokémon) This is the dataset of acerola/アセロラ (Pokémon), containing 500 images and their tags. The core tags of this character are `purple_hair, hair_ornament, flipped_hair, short_hair, bangs, purple_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 476.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acerola_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 295.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acerola_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1073 | 591.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acerola_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 432.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acerola_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1073 | 807.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acerola_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/acerola_pokemon', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 33 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, multicolored_dress, stitches, topknot, torn_dress, short_sleeves, armlet, open_mouth, hairclip, collarbone, tongue, medium_hair, :d, pokemon_(creature), eyelashes, looking_at_viewer, grey_dress, blush, :3, grey_eyes, hands_up | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, :3, armlet, closed_mouth, flip-flops, grey_dress, hairclip, medium_hair, multicolored_dress, short_sleeves, smile, solo, stitches, toes, topknot, collarbone, full_body, grey_eyes, standing, torn_dress, looking_at_viewer, white_background, blue_dress, eyelashes, simple_background | | 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, armlet, looking_at_viewer, short_sleeves, topknot, :3, closed_mouth, multicolored_dress, simple_background, solo, stitches, blush, smile, torn_dress, white_background, collarbone, grey_dress, hairclip | | 3 | 9 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, :3, sandals, short_sleeves, smile, stitches, armlet, full_body, topknot, torn_dress, blush, collarbone, simple_background, standing, white_background, open_mouth, pokemon_(creature), solo | | 4 | 11 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, blush, navel, looking_at_viewer, official_alternate_costume, open_mouth, hair_flower, topknot, eyelashes, necklace, tongue, floral_print, solo, :d, collarbone, medium_hair, bikini, day, bracelet, outdoors, pokemon_(creature), sarong | | 5 | 35 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | hood_up, official_alternate_costume, 1girl, eyelashes, tongue, open_mouth, single_glove, bead_bracelet, hooded_capelet, black_gloves, pantyhose, hands_up, blush, looking_at_viewer, medium_hair, pokemon_(creature), orange_shorts, striped_shorts, themed_object, cosplay, shoes, halloween, :d, vertical_stripes, solo | | 6 | 9 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1boy, 1girl, hetero, open_mouth, blush, navel, nipples, sex, solo_focus, vaginal, nude, spread_legs, cum_in_pussy, bar_censor, collarbone, tongue, topknot, medium_breasts, on_back, veiny_penis | ### Table Version 
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | multicolored_dress | stitches | topknot | torn_dress | short_sleeves | armlet | open_mouth | hairclip | collarbone | tongue | medium_hair | :d | pokemon_(creature) | eyelashes | looking_at_viewer | grey_dress | blush | :3 | grey_eyes | hands_up | closed_mouth | flip-flops | smile | solo | toes | full_body | standing | white_background | blue_dress | simple_background | sandals | navel | official_alternate_costume | hair_flower | necklace | floral_print | bikini | day | bracelet | outdoors | sarong | hood_up | single_glove | bead_bracelet | hooded_capelet | black_gloves | pantyhose | orange_shorts | striped_shorts | themed_object | cosplay | shoes | halloween | vertical_stripes | 1boy | hetero | nipples | sex | solo_focus | vaginal | nude | spread_legs | cum_in_pussy | bar_censor | medium_breasts | on_back | veiny_penis | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------|:-----------|:----------|:-------------|:----------------|:---------|:-------------|:-----------|:-------------|:---------|:--------------|:-----|:---------------------|:------------|:--------------------|:-------------|:--------|:-----|:------------|:-----------|:---------------|:-------------|:--------|:-------|:-------|:------------|:-----------|:-------------------|:-------------|:--------------------|:----------|:--------|:-----------------------------|:--------------|:-----------|:---------------|:---------|:------|:-----------|:-----------|:---------|:----------|:---------------|:----------------|:-----------------|:---------------|:------------|:----------------|:-----------------|:----------------|:----------|:--------|:------------|:-------------------|:-------|:---------|:----------|:------|:-------------|:----------|:-------|:--------------|:---------------|:-------------|:-----------------|:----------|:--------------| | 0 | 33 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | | X | X | | X | | | X | X | X | | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | X | | X | X | | | | | | X | X | X | X | | | X | | X | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 9 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | X | X | X | X | | X | | | | X | | | | X | X | | | | | X | X | | X | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 11 | ![](samples/4/clu4-sample0.png) | 
![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | | | | X | | X | X | X | X | X | X | X | | X | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 35 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | | | | | X | | | X | X | X | X | X | X | | X | | | X | | | | X | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | 6 | 9 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | X | | | | X | | X | X | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/acerola_pokemon
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T03:45:08+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T21:10:03+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of acerola/アセロラ (Pokémon) ================================= This is the dataset of acerola/アセロラ (Pokémon), containing 500 images and their tags. The core tags of this character are 'purple\_hair, hair\_ornament, flipped\_hair, short\_hair, bangs, purple\_eyes', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code. List of Clusters ---------------- List of tag clustering results; maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
55ea0aaec63dc9e744f6ac36053079f845b94790
# Dataset Card for Evaluation run of behnamsh/gpt2_platypus-camel_physics ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/behnamsh/gpt2_platypus-camel_physics - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [behnamsh/gpt2_platypus-camel_physics](https://huggingface.co/behnamsh/gpt2_platypus-camel_physics) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-28T03:21:29.726636](https://huggingface.co/datasets/open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics/blob/main/results_2023-10-28T03-21-29.726636.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0032508389261744967, "em_stderr": 0.0005829486708558949, "f1": 0.048523489932885924, "f1_stderr": 0.0013787561628243192, "acc": 0.25769534333070243, "acc_stderr": 0.0070229133948918315 }, "harness|drop|3": { "em": 0.0032508389261744967, "em_stderr": 0.0005829486708558949, "f1": 0.048523489932885924, "f1_stderr": 0.0013787561628243192 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.5153906866614049, "acc_stderr": 0.014045826789783663 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
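Beyond the winogrande example above, the same `load_dataset` call works for any of this repository's configurations. The snippet below is an illustrative sketch rather than part of the original card: the config names ("results", "harness_gsm8k_5") and the "latest" split are taken from this repository's own configuration metadata, while the column layout of the returned tables is whatever the underlying parquet files define.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics"

# Aggregated metrics of the most recent run; "latest" always points to the
# newest timestamped split, as described in the card above.
results = load_dataset(REPO, "results", split="latest")
print(results)

# Per-example details for a single task, e.g. the 5-shot GSM8K evaluation.
gsm8k_details = load_dataset(REPO, "harness_gsm8k_5", split="latest")
print(gsm8k_details[0])
```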
open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics
[ "region:us" ]
2023-09-12T03:47:42+00:00
{"pretty_name": "Evaluation run of behnamsh/gpt2_platypus-camel_physics", "dataset_summary": "Dataset automatically created during the evaluation run of model [behnamsh/gpt2_platypus-camel_physics](https://huggingface.co/behnamsh/gpt2_platypus-camel_physics) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T03:21:29.726636](https://huggingface.co/datasets/open-llm-leaderboard/details_behnamsh__gpt2_platypus-camel_physics/blob/main/results_2023-10-28T03-21-29.726636.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0032508389261744967,\n \"em_stderr\": 0.0005829486708558949,\n \"f1\": 0.048523489932885924,\n \"f1_stderr\": 0.0013787561628243192,\n \"acc\": 0.25769534333070243,\n \"acc_stderr\": 0.0070229133948918315\n },\n \"harness|drop|3\": {\n \"em\": 0.0032508389261744967,\n \"em_stderr\": 0.0005829486708558949,\n \"f1\": 0.048523489932885924,\n \"f1_stderr\": 0.0013787561628243192\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5153906866614049,\n \"acc_stderr\": 0.014045826789783663\n }\n}\n```", "repo_url": "https://huggingface.co/behnamsh/gpt2_platypus-camel_physics", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|arc:challenge|25_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T03_21_29.726636", "path": ["**/details_harness|drop|3_2023-10-28T03-21-29.726636.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T03-21-29.726636.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T03_21_29.726636", "path": ["**/details_harness|gsm8k|5_2023-10-28T03-21-29.726636.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T03-21-29.726636.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hellaswag|10_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-47-32.538128.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-47-32.538128.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T04-47-32.538128.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T04-47-32.538128.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T04-47-32.538128.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T03_21_29.726636", "path": ["**/details_harness|winogrande|5_2023-10-28T03-21-29.726636.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T03-21-29.726636.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_12T04_47_32.538128", "path": ["results_2023-09-12T04-47-32.538128.parquet"]}, {"split": "2023_10_28T03_21_29.726636", "path": ["results_2023-10-28T03-21-29.726636.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T03-21-29.726636.parquet"]}]}]}
2023-10-28T02:21:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of behnamsh/gpt2_platypus-camel_physics ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model behnamsh/gpt2_platypus-camel_physics on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-28T03:21:29.726636(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of behnamsh/gpt2_platypus-camel_physics", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model behnamsh/gpt2_platypus-camel_physics on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T03:21:29.726636(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of behnamsh/gpt2_platypus-camel_physics", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model behnamsh/gpt2_platypus-camel_physics on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-28T03:21:29.726636(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 27, 31, 175, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of behnamsh/gpt2_platypus-camel_physics## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model behnamsh/gpt2_platypus-camel_physics on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T03:21:29.726636(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
bf4ba51a99ece1dcf43e96d01a9e788f4262cdda
# Dataset Card for Evaluation run of lgaalves/llama-2-13b-chat-platypus ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/lgaalves/llama-2-13b-chat-platypus - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [lgaalves/llama-2-13b-chat-platypus](https://huggingface.co/lgaalves/llama-2-13b-chat-platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-27T20:27:56.260953](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus/blob/main/results_2023-10-27T20-27-56.260953.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0035654362416107383, "em_stderr": 0.0006104082299890483, "f1": 0.06259542785234914, "f1_stderr": 0.001452272347431231, "acc": 0.44182080490769055, "acc_stderr": 0.010533564468131328 }, "harness|drop|3": { "em": 0.0035654362416107383, "em_stderr": 0.0006104082299890483, "f1": 0.06259542785234914, "f1_stderr": 0.001452272347431231 }, "harness|gsm8k|5": { "acc": 0.12357846853677028, "acc_stderr": 0.009065050306776914 }, "harness|winogrande|5": { "acc": 0.7600631412786109, "acc_stderr": 0.01200207862948574 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
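As a complementary, hypothetical sketch (not part of the original card), you can enumerate this repository's configurations and inspect the timestamped runs behind the "latest" alias; the config name "harness_drop_3" below is taken from this repository's configuration metadata.

```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus"

# Discover every configuration (one per evaluated task, plus "results").
configs = get_dataset_config_names(REPO)
print(len(configs), configs[:5])

# Omitting `split` returns a DatasetDict with one entry per run timestamp
# plus the "latest" alias described in the card above.
drop_runs = load_dataset(REPO, "harness_drop_3")
print(list(drop_runs.keys()))
print(drop_runs["latest"])
```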
open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus
[ "region:us" ]
2023-09-12T03:55:12+00:00
{"pretty_name": "Evaluation run of lgaalves/llama-2-13b-chat-platypus", "dataset_summary": "Dataset automatically created during the evaluation run of model [lgaalves/llama-2-13b-chat-platypus](https://huggingface.co/lgaalves/llama-2-13b-chat-platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-27T20:27:56.260953](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus/blob/main/results_2023-10-27T20-27-56.260953.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0035654362416107383,\n \"em_stderr\": 0.0006104082299890483,\n \"f1\": 0.06259542785234914,\n \"f1_stderr\": 0.001452272347431231,\n \"acc\": 0.44182080490769055,\n \"acc_stderr\": 0.010533564468131328\n },\n \"harness|drop|3\": {\n \"em\": 0.0035654362416107383,\n \"em_stderr\": 0.0006104082299890483,\n \"f1\": 0.06259542785234914,\n \"f1_stderr\": 0.001452272347431231\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12357846853677028,\n \"acc_stderr\": 0.009065050306776914\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7600631412786109,\n \"acc_stderr\": 0.01200207862948574\n }\n}\n```", "repo_url": "https://huggingface.co/lgaalves/llama-2-13b-chat-platypus", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|arc:challenge|25_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_27T20_27_56.260953", "path": ["**/details_harness|drop|3_2023-10-27T20-27-56.260953.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-27T20-27-56.260953.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_27T20_27_56.260953", "path": ["**/details_harness|gsm8k|5_2023-10-27T20-27-56.260953.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-27T20-27-56.260953.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hellaswag|10_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-12T04-54-55.763898.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-54-55.763898.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-54-55.763898.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T04-54-55.763898.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T04-54-55.763898.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T04-54-55.763898.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_27T20_27_56.260953", "path": ["**/details_harness|winogrande|5_2023-10-27T20-27-56.260953.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-27T20-27-56.260953.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_12T04_54_55.763898", "path": ["results_2023-09-12T04-54-55.763898.parquet"]}, {"split": "2023_10_27T20_27_56.260953", "path": ["results_2023-10-27T20-27-56.260953.parquet"]}, {"split": "latest", "path": ["results_2023-10-27T20-27-56.260953.parquet"]}]}]}
2023-10-27T19:28:08+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of lgaalves/llama-2-13b-chat-platypus

## Dataset Description

- Homepage: 
- Repository: URL
- Paper: 
- Leaderboard: URL
- Point of Contact: clementine@URL

### Dataset Summary

Dataset automatically created during the evaluation run of model lgaalves/llama-2-13b-chat-platypus on the Open LLM Leaderboard.

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the loading example reproduced after the section headings below):

## Latest results

These are the latest results from run 2023-10-27T20:27:56.260953 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

### Supported Tasks and Leaderboards

### Languages

## Dataset Structure

### Data Instances

### Data Fields

### Data Splits

## Dataset Creation

### Curation Rationale

### Source Data

#### Initial Data Collection and Normalization

#### Who are the source language producers?

### Annotations

#### Annotation process

#### Who are the annotators?

### Personal and Sensitive Information

## Considerations for Using the Data

### Social Impact of Dataset

### Discussion of Biases

### Other Known Limitations

## Additional Information

### Dataset Curators

### Licensing Information

### Contributions
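For reference, the loading snippet recorded in this entry's metadata (the one the "To load the details from a run" sentence above refers to) uses the standard `datasets` API with the `harness_winogrande_5` configuration as an example:

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task (here: Winogrande, 5-shot).
# Per the card text, the "train" split always points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_lgaalves__llama-2-13b-chat-platypus",
    "harness_winogrande_5",
    split="train",
)
```

Any other configuration name listed in this entry's metadata (for example `harness_gsm8k_5` or `harness_drop_3`) can be substituted for `harness_winogrande_5`.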
[ "# Dataset Card for Evaluation run of lgaalves/llama-2-13b-chat-platypus", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model lgaalves/llama-2-13b-chat-platypus on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T20:27:56.260953(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of lgaalves/llama-2-13b-chat-platypus", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model lgaalves/llama-2-13b-chat-platypus on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T20:27:56.260953(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 25, 31, 173, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of lgaalves/llama-2-13b-chat-platypus## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model lgaalves/llama-2-13b-chat-platypus on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-27T20:27:56.260953(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
b739c38d31023eaa197b4c2760fcefa90c31f714
# IWSLT 2014 German-English Translation Dataset w/ further processing

This dataset was built with fairseq's processing script, the original of which can be found [here](https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-iwslt14.sh).

`iwslt14.tokenized.de-en.zip` and `binarized.zip` can be built by running:

```
git clone https://huggingface.co/datasets/shijli/iwslt14-deen
cd iwslt14-deen/data
bash prepare-iwslt14.sh
```

`binarized.dist.de-en.zip` is a distilled dataset generated by a transformer base model. It can be built by running:

```
bash prepare-iwslt14-distill.sh /path/to/fairseq/model source-lang target-lang
```

To build this dataset, you need to create `binarized.zip` first. Note that the distilled dataset only uses model-generated target sentences, which means that different translation directions result in different datasets. Therefore, you need to specify `source-lang` and `target-lang` explicitly. Also, you need to replace `/path/to/fairseq/model` with the path of your pretrained model.
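If you only need the prepared archives rather than rebuilding them with the scripts above, a minimal download sketch is shown below. This is an illustrative sketch, not part of the original card: it assumes the zip archive is stored under the `data/` folder of the `shijli/iwslt14-deen` dataset repo (adjust `filename` to the actual layout) and uses `huggingface_hub` together with Python's `zipfile` module.

```python
import zipfile

from huggingface_hub import hf_hub_download

# Assumption: the prebuilt archive lives at data/iwslt14.tokenized.de-en.zip
# in the shijli/iwslt14-deen dataset repo; adjust the path if it differs.
zip_path = hf_hub_download(
    repo_id="shijli/iwslt14-deen",
    repo_type="dataset",
    filename="data/iwslt14.tokenized.de-en.zip",
)

# Extract the tokenized train/valid/test files to a local directory.
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall("iwslt14.tokenized.de-en")
```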
shijli/iwslt14-deen
[ "region:us" ]
2023-09-12T04:24:16+00:00
{}
2023-09-27T06:26:53+00:00
[]
[]
TAGS #region-us
# IWSLT 2014 German-English Translation Dataset w/ further processing This dataset was built with the fairseq's processing script, which can be originally found here 'URL' and 'URL' can be built by running: 'URL' is a distilled dataset generated by a transformer base model. It can be built by running: To build this dataset, you need to create 'URL' first. Note that the distilled dataset only uses model-generated target sentences, which means that different translation directions result in different datasets. Therefore, you need to specify 'source-lang' and 'target-lang' explicitly. Also, you need to replace '/path/to/fairseq/model' with the path of your pretrained model.
[ "# IWSLT 2014 German-English Translation Dataset w/ further processing\n\nThis dataset was built with the fairseq's processing script, which can be originally\nfound here\n\n'URL' and 'URL' can be built by running:\n\n\n\n'URL' is a distilled dataset generated by a transformer base model. It can be built by running:\n\n\n\nTo build this dataset, you need to create 'URL' first. Note that the distilled dataset only uses model-generated\ntarget sentences, which means that different translation directions result in different datasets. Therefore, you need to\nspecify 'source-lang' and 'target-lang' explicitly. Also, you need to replace '/path/to/fairseq/model' with the path of\nyour pretrained model." ]
[ "TAGS\n#region-us \n", "# IWSLT 2014 German-English Translation Dataset w/ further processing\n\nThis dataset was built with the fairseq's processing script, which can be originally\nfound here\n\n'URL' and 'URL' can be built by running:\n\n\n\n'URL' is a distilled dataset generated by a transformer base model. It can be built by running:\n\n\n\nTo build this dataset, you need to create 'URL' first. Note that the distilled dataset only uses model-generated\ntarget sentences, which means that different translation directions result in different datasets. Therefore, you need to\nspecify 'source-lang' and 'target-lang' explicitly. Also, you need to replace '/path/to/fairseq/model' with the path of\nyour pretrained model." ]
[ 6, 176 ]
[ "passage: TAGS\n#region-us \n# IWSLT 2014 German-English Translation Dataset w/ further processing\n\nThis dataset was built with the fairseq's processing script, which can be originally\nfound here\n\n'URL' and 'URL' can be built by running:\n\n\n\n'URL' is a distilled dataset generated by a transformer base model. It can be built by running:\n\n\n\nTo build this dataset, you need to create 'URL' first. Note that the distilled dataset only uses model-generated\ntarget sentences, which means that different translation directions result in different datasets. Therefore, you need to\nspecify 'source-lang' and 'target-lang' explicitly. Also, you need to replace '/path/to/fairseq/model' with the path of\nyour pretrained model." ]
ec0687a54426999328f8e6b014e7868fdddbc314
# Dataset of illyasviel_von_einzbern/イリヤスフィール・フォン・アインツベルン/伊莉雅丝菲尔·冯·爱因兹贝伦 (Fate/Grand Order)

This is the dataset of illyasviel_von_einzbern/イリヤスフィール・フォン・アインツベルン/伊莉雅丝菲尔·冯·爱因兹贝伦 (Fate/Grand Order), containing 500 images and their tags.

The core tags of this character are `long_hair, red_eyes, white_hair, bangs, hair_between_eyes`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size       | Download                                                                                                                       | Type       | Description                                                          |
|:-----------------|-------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw              | 500    | 641.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/illyasviel_von_einzbern_fgo/resolve/main/dataset-raw.zip)                | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 500    | 373.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/illyasviel_von_einzbern_fgo/resolve/main/dataset-800.zip)                | IMG+TXT    | dataset with the shorter side not exceeding 800 pixels.              |
| stage3-p480-800  | 1188   | 793.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/illyasviel_von_einzbern_fgo/resolve/main/dataset-stage3-p480-800.zip)    | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |
| 1200             | 500    | 570.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/illyasviel_von_einzbern_fgo/resolve/main/dataset-1200.zip)               | IMG+TXT    | dataset with the shorter side not exceeding 1200 pixels.             |
| stage3-p480-1200 | 1188   | 1.09 GiB   | [Download](https://huggingface.co/datasets/CyberHarem/illyasviel_von_einzbern_fgo/resolve/main/dataset-stage3-p480-1200.zip)   | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |

### Load Raw Dataset with Waifuc

We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/illyasviel_von_einzbern_fgo',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, navel, official_alternate_costume, solo, flat_chest, purple_bikini, open_mouth, smile, blush, grey_hair | | 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, long_sleeves, looking_at_viewer, white_skirt, ascot, open_mouth, :d, purple_shirt, floating_hair, standing, frilled_skirt | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, boots, frilled_skirt, long_sleeves, purple_footwear, white_skirt, full_body, looking_at_viewer, pleated_skirt, smile, solo, standing, ascot, closed_mouth, purple_shirt | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, :d, blush, long_sleeves, looking_at_viewer, open_mouth, papakha, purple_capelet, purple_headwear, solo, white_scarf, outdoors, purple_coat, snowing | | 4 | 15 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, purple_headwear, solo, papakha, white_scarf, coat, open_mouth, smile | | 5 | 11 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, coat, purple_headwear, solo, white_scarf, boots, fur_hat, purple_footwear, snowing | | 6 | 9 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, solo, upper_body, ascot, closed_mouth, simple_background, long_sleeves, looking_at_viewer, blush, white_background, purple_shirt, smile | | 7 | 10 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, black_skirt, solo, homurahara_academy_school_uniform, looking_at_viewer, blonde_hair, pleated_skirt, blush, long_sleeves, white_background, open_mouth, simple_background, white_socks, :d, kneehighs, red_bow, red_ribbon, brown_headwear, collarbone, full_body, hat, shoes, standing, white_sailor_collar, white_shirt | | 8 | 39 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, solo, magical_girl, 
elbow_gloves, looking_at_viewer, blonde_hair, feather_hair_ornament, white_gloves, thighhighs, blush, two_side_up, open_mouth, bare_shoulders, small_breasts, detached_sleeves, pink_footwear, thigh_boots, pink_dress, skirt, holding_wand, :d, white_background | | 9 | 6 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, black_gloves, blonde_hair, cat_ears, cat_tail, elbow_gloves, fake_animal_ears, hair_ribbon, jingle_bell, looking_at_viewer, official_alternate_costume, paw_gloves, red_ribbon, simple_background, solo, bare_shoulders, black_leotard, blush, fur_trim, hair_bell, open_mouth, tail_ribbon, small_breasts, tail_bell, white_background, ass, center_opening, fur_collar, heart, navel | | 10 | 13 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | looking_at_viewer, 1girl, floral_print, solo, blush, obi, pink_kimono, smile, print_kimono, wide_sleeves, blonde_hair, long_sleeves, closed_mouth, hair_flower, sidelocks | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | navel | official_alternate_costume | solo | flat_chest | purple_bikini | open_mouth | smile | blush | grey_hair | long_sleeves | looking_at_viewer | white_skirt | ascot | :d | purple_shirt | floating_hair | standing | frilled_skirt | boots | purple_footwear | full_body | pleated_skirt | closed_mouth | papakha | purple_capelet | purple_headwear | white_scarf | outdoors | purple_coat | snowing | coat | fur_hat | upper_body | simple_background | white_background | black_skirt | homurahara_academy_school_uniform | blonde_hair | white_socks | kneehighs | red_bow | red_ribbon | brown_headwear | collarbone | hat | shoes | white_sailor_collar | white_shirt | magical_girl | elbow_gloves | feather_hair_ornament | white_gloves | thighhighs | two_side_up | bare_shoulders | small_breasts | detached_sleeves | pink_footwear | thigh_boots | pink_dress | skirt | holding_wand | black_gloves | cat_ears | cat_tail | fake_animal_ears | hair_ribbon | jingle_bell | paw_gloves | black_leotard | fur_trim | hair_bell | tail_ribbon | tail_bell | ass | center_opening | fur_collar | heart | floral_print | obi | pink_kimono | print_kimono | wide_sleeves | hair_flower | sidelocks | 
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:-----------------------------|:-------|:-------------|:----------------|:-------------|:--------|:--------|:------------|:---------------|:--------------------|:--------------|:--------|:-----|:---------------|:----------------|:-----------|:----------------|:--------|:------------------|:------------|:----------------|:---------------|:----------|:-----------------|:------------------|:--------------|:-----------|:--------------|:----------|:-------|:----------|:-------------|:--------------------|:-------------------|:--------------|:------------------------------------|:--------------|:--------------|:------------|:----------|:-------------|:-----------------|:-------------|:------|:--------|:----------------------|:--------------|:---------------|:---------------|:------------------------|:---------------|:-------------|:--------------|:-----------------|:----------------|:-------------------|:----------------|:--------------|:-------------|:--------|:---------------|:---------------|:-----------|:-----------|:-------------------|:--------------|:--------------|:-------------|:----------------|:-----------|:------------|:--------------|:------------|:------|:-----------------|:-------------|:--------|:---------------|:------|:--------------|:---------------|:---------------|:--------------|:------------| | 0 | 9 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 14 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | X | | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | X | | | | X | | | X | X | X | X | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | X | | | X | | X | | X | X | | | X | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 15 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | | | X | X | | | | | | | | | | | | | | | | | X | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 11 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | 
![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | X | | | | | | | | | | | | | | | | X | X | | | | | | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 9 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | | X | | | | X | X | | X | X | | X | | X | | | | | | | | X | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 10 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | | X | | | X | | X | | X | X | | | X | | | X | | | | X | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 39 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | | X | | | X | | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 6 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | X | X | | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | | | X | | | | X | | | | | | | | X | | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | 10 | 13 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | | X | | | | X | X | | X | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
CyberHarem/illyasviel_von_einzbern_fgo
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T04:37:47+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-12T00:35:55+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of illyasviel\_von\_einzbern/イリヤスフィール・フォン・アインツベルン/伊莉雅丝菲尔·冯·爱因兹贝伦 (Fate/Grand Order) =========================================================================================== This is the dataset of illyasviel\_von\_einzbern/イリヤスフィール・フォン・アインツベルン/伊莉雅丝菲尔·冯·爱因兹贝伦 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are 'long\_hair, red\_eyes, white\_hair, bangs, hair\_between\_eyes', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
27a6b548857b4af33a74428d1cafde2f43962f67
# Dataset Card for "instruction_merge_set" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
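No card text is provided beyond the stub above, but this entry's metadata records the schema: each row has an `id`, an `instruction` string, and a `conversations` list of `{from, value}` turns, with a single `train` split of roughly 10M examples. As a hedged sketch (not part of the original card), the dataset can be streamed with the standard `datasets` API to avoid downloading the full ~3.5 GB archive up front:

```python
from datasets import load_dataset

# Stream the (large) train split instead of downloading it all at once.
ds = load_dataset("botp/LinkSoul-instruction_merge_set", split="train", streaming=True)

# Inspect one example: id, instruction, and the conversation turns.
example = next(iter(ds))
print(example["id"], example["instruction"])
for turn in example["conversations"]:
    print(turn["from"], ":", turn["value"][:80])
```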
botp/LinkSoul-instruction_merge_set
[ "region:us" ]
2023-09-12T04:39:45+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "instruction", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13444870155, "num_examples": 10077297}], "download_size": 3542585235, "dataset_size": 13444870155}, "duplicated_from": "LinkSoul/instruction_merge_set"}
2023-09-12T04:39:45+00:00
[]
[]
TAGS #region-us
# Dataset Card for "instruction_merge_set" More Information needed
[ "# Dataset Card for \"instruction_merge_set\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"instruction_merge_set\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"instruction_merge_set\"\n\nMore Information needed" ]
5eee2dfca4b08aab292cafa5eefd7a64961933f4
## LawDataset-BUPT ⚖️ Here is the full data from the Legal LLM project, from which we hope to build a high quality dataset. Here's our [github project page](https://github.com/KLGR123/LegalLLM-BUPT). If you want to make any contribution, please contact me QQ 2248157602. ### Data Source Our data mainly comes from - CrimeKgAssistant, 856 crime KG items / 2800k crime name_entities / 200k lawQA with 13 classes - Tigerbot-law-plugin 55k laws provision data with 11 classes - Wenshu_ms_dataset 45k law judgements data - Lexilaw - LawGPT-zh 52k QA data - Lawyer_LLAMA law exam and instruction data - hualv_webste_QA 20k law QA data - baidu_zhidao_law_QA 36k law QA data - BELLE general dataset 1.5M For BELLE dataset and models, please download directly at [BELLE huggingface page](https://huggingface.co/datasets/BELLE-2/train_3.5M_CN_With_Category). ### Data Statistics So far the dataset size is around - Law QA data size: ~310k - Law provision data size: ~55k - Law judgement data size: ~45k - General data size: ~1500k ### Data Fields You can check the different data field for each source data. Wenshu_ms_dataset 45k law judgements data ``` { "Case": "王某甲与辽宁古田房地产有限公司房屋拆迁安置补偿合同纠纷一审民事判决书", "CaseId": "7abb676880254ca79c34a90e0101bc8e", "CaseProc": "民事一审", "CaseRecord": "原告王某甲与被告辽宁古田房地产有限公司房屋拆迁安置补偿合同纠纷一案,本院于2018年4月26日受理后,依法由审判员雷凯独任审判,公开开庭进行了审理。原告王某甲与被告辽宁古田房地产有限公司的委托代理人李某、刘某某到庭参加诉讼。本案现已审理终结", "CaseType": "民事案件", "JudgeAccusation": "原告王某甲诉称:原告原住大东区XX,2009年动迁至2014年回迁,至今被告没给原告房屋补助款。原告多次向被告主张房屋补助款,被告总是说没钱等等再等等。后来被告用这笔款给原告折抵五年物业费(从2015.1.1至2019.12.31),剩余房屋补助费3万多,到现在一直没解决,故起诉至法院。请求法院判令1、被告给付原告房屋拆迁款48000元;2、起诉费由被告承担。\n被告辽宁古田房地产有限公司辩称:针对原告诉讼请求48000元,被告对此不予认可,原、被告双方于2016年9月21日签订了协议书一份,对双方拆迁安置补助费的具体数额进行了重新确认,顶5年物业费后,尚欠安置费33828元。现原告诉讼请求48000元无法律依据,应按双方签订的协议书继续履行,该协议书系双方真实意思表示,具有法律效力。\n经审理查明:2008年7月25日,原被告签订城市房屋拆迁补偿安置协议。2016年9月21日,原告与被告签订协议书,该协议约定逾期安置补助费为48000元,原被告双方同意按百分之八十即38400元进行全部抵顶。其中4572元抵顶原告房屋五年的物业费(从2015年1月1日至2019年12月31日期间),剩余33828元待被告资金充足时解决。原告在庭审中自述从2015年至今没有缴纳过物业费。\n上述事实,有城市房屋拆迁补偿安置协议、协议书等证据及原被告陈述,经开庭质证,本院予以确认,在卷佐证", "JudgeReason": "本院认为:2016年9月21日,原告与被告签订协议书系双方真实的意思表示,内容不违反法律规定,合法有效,双方均应遵守。在该协议中,原被告协商一致在抵顶五年的物业费后,被告尚欠原告逾期安置补助费33828元,被告至今没有给付原告,故被告应当给付原告逾期安置补助费33828元。\n综上所述,根据《中华人民共和国合同法》第四十四条之规定,判决如下", "JudgeResult": "一、被告辽宁古田房地产有限公司于本判决生效后十日内给付原告王某甲逾期安置补助费33828元;\n二、驳回原告王某甲的其他诉讼请求。\n如被告未按本判决所指定的期限履行给付义务,则应当依照《中华人民共和国民事诉讼法》第二百五十三条之规定,加倍支付迟延履行期间的债务利息。\n案件受理费1000元,减半收取500元,由原告王某甲负担177元,由被告辽宁古田房地产有限公司负担323元。\n如不服本判决,可在判决书送达之日起15日内向本院递交上诉状,并按对方当事人的人数提出副本,交纳上诉案件受理费,上诉于辽宁省沈阳市中级人民法院。如上诉期满后7日内未交纳上诉案件受理费,按自动撤回上诉处理", "Keywords": [ "给付" ], "Parties": [ { "NameText": "王某甲", "Name": "王某甲", "LegalEntity": "Person", "Prop": "原告" }, { "NameText": "辽宁古田房地产有限公司", "Name": "辽宁古田房地产有限公司", "LegalEntity": "Corporation", "Prop": "被告" } ], "Category": { "cat_1": "房地产纠纷", "cat_2": "房产纠纷" } } ``` Tigerbot-law-plugin 55k laws provision data with 11 classes ``` {"type": "宪法", "title": "中华人民共和国宪法", "chapter1": "第一章 总纲", "content": "第六条 中华人民共和国的社会主义经济制度的基础是生产资料的社会主义公有制,即全民所有制和劳动群众集体所有制。社会主义公有制消灭人剥削人的制度,实行各尽所能、按劳分配的原则。\n国家在社会主义初级阶段,坚持公有制为主体、多种所有制经济共同发展的基本经济制度,坚持按劳分配为主体、多种分配方式并存的分配制度。", "chapter2": "", "chapter3": ""} ``` baidu_zhidao_law_QA 36k law QA data ``` title,question,reply,is_best 在法律中定金与订金的区别订金和定金哪个受,,“定金”是指当事人约定由一方向对方给付的,作为债权担保的一定数额的货币,它属于一种法律上的担保方式,目的在于促使债务人履行债务,保障债权人的债权得以实现。签合同时,对定金必需以书面形式进行约定,同时还应约定定金的数额和交付期限。给付定金一方如果不履行债务,无权要求另一方返还定金;接受定金的一方如果不履行债务,需向另一方双倍返还债务。债务人履行债务后,依照约定,定金应抵作价款或者收回。而“订金”目前我国法律没有明确规定,它不具备定金所具有的担保性质,可视为“预付款”,当合同不能履行时,除不可抗力外,应根据双方当事人的过错承担违约责任。,1 ``` CrimeKgAssistant ``` { "completion": 
"根据相关法律规定,未满14岁的人不得驾驶机动车辆,骑行电动车也属于机动车范畴。因此,未成年捏了电动车的刹车,可以认定为违法行为。同时,成年骑电动车的人也应当承担相应的民事责任。", "prompt": "一14岁未成年骑自行车与一成年骑电动车相撞,T形路,未成年拐弯,未成年捏了刹车的电动车属于机动车吗??" } ``` JEC-QA ``` {"answer": ["D"], "id": "3_2613", "option_list": {"A": "因未办理收养登记,包某与陈煜之间不存在法律上父子关系", "B": "陈煜作为包某生前抚养且无经济来源的人,可适当分得包某遗产", "C": "陈某的遗产由洪某与陈婴继承,陈煜不能继承", "D": "陈煜既可以继承陈某的遗产,也可以继承包某的遗产"}, "statement": "陈某与潘某离婚后,潘某带着2岁的儿子陈煜改嫁包某。陈某、潘某、包某三人订立收养协议,陈煜由包某收养,今后一切与陈某概无关系,但未办理收养登记。5年后,潘某与包某生下一女,取名包红。陈某离婚后,与洪某结婚,生女取名陈婴。几年后,陈某、包某相继去世。下列说法中正确的是:", "type": "1"} ```
bupt/LawDataset-BUPT
[ "size_categories:1M<n<10M", "language:zh", "legal", "region:us" ]
2023-09-12T04:55:37+00:00
{"language": ["zh"], "size_categories": ["1M<n<10M"], "pretty_name": "LawDataset-BPUT", "tags": ["legal"]}
2023-11-11T13:23:04+00:00
[]
[ "zh" ]
TAGS #size_categories-1M<n<10M #language-Chinese #legal #region-us
## LawDataset-BUPT ️ Here is the full data from the Legal LLM project, from which we hope to build a high quality dataset. Here's our github project page. If you want to make any contribution, please contact me QQ 2248157602. ### Data Source Our data mainly comes from - CrimeKgAssistant, 856 crime KG items / 2800k crime name_entities / 200k lawQA with 13 classes - Tigerbot-law-plugin 55k laws provision data with 11 classes - Wenshu_ms_dataset 45k law judgements data - Lexilaw - LawGPT-zh 52k QA data - Lawyer_LLAMA law exam and instruction data - hualv_webste_QA 20k law QA data - baidu_zhidao_law_QA 36k law QA data - BELLE general dataset 1.5M For BELLE dataset and models, please download directly at BELLE huggingface page. ### Data Statistics So far the dataset size is around - Law QA data size: ~310k - Law provision data size: ~55k - Law judgement data size: ~45k - General data size: ~1500k ### Data Fields You can check the different data field for each source data. Wenshu_ms_dataset 45k law judgements data Tigerbot-law-plugin 55k laws provision data with 11 classes baidu_zhidao_law_QA 36k law QA data CrimeKgAssistant JEC-QA
[ "## LawDataset-BUPT ️\nHere is the full data from the Legal LLM project, from which we hope to build a high quality dataset.\n\nHere's our github project page.\n\nIf you want to make any contribution, please contact me QQ 2248157602.", "### Data Source\nOur data mainly comes from\n- CrimeKgAssistant, 856 crime KG items / 2800k crime name_entities / 200k lawQA with 13 classes\n- Tigerbot-law-plugin 55k laws provision data with 11 classes\n- Wenshu_ms_dataset 45k law judgements data\n- Lexilaw\n - LawGPT-zh 52k QA data\n - Lawyer_LLAMA law exam and instruction data\n - hualv_webste_QA 20k law QA data \n - baidu_zhidao_law_QA 36k law QA data\n- BELLE general dataset 1.5M\n\nFor BELLE dataset and models, please download directly at BELLE huggingface page.", "### Data Statistics\nSo far the dataset size is around\n- Law QA data size: ~310k\n- Law provision data size: ~55k\n- Law judgement data size: ~45k\n- General data size: ~1500k", "### Data Fields\nYou can check the different data field for each source data.\n\nWenshu_ms_dataset 45k law judgements data\n\n\nTigerbot-law-plugin 55k laws provision data with 11 classes\n\n\nbaidu_zhidao_law_QA 36k law QA data\n\n\nCrimeKgAssistant\n\n\nJEC-QA" ]
[ "TAGS\n#size_categories-1M<n<10M #language-Chinese #legal #region-us \n", "## LawDataset-BUPT ️\nHere is the full data from the Legal LLM project, from which we hope to build a high quality dataset.\n\nHere's our github project page.\n\nIf you want to make any contribution, please contact me QQ 2248157602.", "### Data Source\nOur data mainly comes from\n- CrimeKgAssistant, 856 crime KG items / 2800k crime name_entities / 200k lawQA with 13 classes\n- Tigerbot-law-plugin 55k laws provision data with 11 classes\n- Wenshu_ms_dataset 45k law judgements data\n- Lexilaw\n - LawGPT-zh 52k QA data\n - Lawyer_LLAMA law exam and instruction data\n - hualv_webste_QA 20k law QA data \n - baidu_zhidao_law_QA 36k law QA data\n- BELLE general dataset 1.5M\n\nFor BELLE dataset and models, please download directly at BELLE huggingface page.", "### Data Statistics\nSo far the dataset size is around\n- Law QA data size: ~310k\n- Law provision data size: ~55k\n- Law judgement data size: ~45k\n- General data size: ~1500k", "### Data Fields\nYou can check the different data field for each source data.\n\nWenshu_ms_dataset 45k law judgements data\n\n\nTigerbot-law-plugin 55k laws provision data with 11 classes\n\n\nbaidu_zhidao_law_QA 36k law QA data\n\n\nCrimeKgAssistant\n\n\nJEC-QA" ]
[ 25, 60, 154, 50, 71 ]
[ "passage: TAGS\n#size_categories-1M<n<10M #language-Chinese #legal #region-us \n## LawDataset-BUPT ️\nHere is the full data from the Legal LLM project, from which we hope to build a high quality dataset.\n\nHere's our github project page.\n\nIf you want to make any contribution, please contact me QQ 2248157602.### Data Source\nOur data mainly comes from\n- CrimeKgAssistant, 856 crime KG items / 2800k crime name_entities / 200k lawQA with 13 classes\n- Tigerbot-law-plugin 55k laws provision data with 11 classes\n- Wenshu_ms_dataset 45k law judgements data\n- Lexilaw\n - LawGPT-zh 52k QA data\n - Lawyer_LLAMA law exam and instruction data\n - hualv_webste_QA 20k law QA data \n - baidu_zhidao_law_QA 36k law QA data\n- BELLE general dataset 1.5M\n\nFor BELLE dataset and models, please download directly at BELLE huggingface page.### Data Statistics\nSo far the dataset size is around\n- Law QA data size: ~310k\n- Law provision data size: ~55k\n- Law judgement data size: ~45k\n- General data size: ~1500k### Data Fields\nYou can check the different data field for each source data.\n\nWenshu_ms_dataset 45k law judgements data\n\n\nTigerbot-law-plugin 55k laws provision data with 11 classes\n\n\nbaidu_zhidao_law_QA 36k law QA data\n\n\nCrimeKgAssistant\n\n\nJEC-QA" ]
086992bb7dc0e1ae6aed09ba52b2f05117b676b0
# Lora FMG-9 (LLaMA2) 09122023 test 1 * Dataset of FMG-9 dialogue from Girls' Frontline * Manually edited to turn into multi-turn dialogue
922-CA/lf2_09122023_test1
[ "license:openrail", "region:us" ]
2023-09-12T05:11:42+00:00
{"license": "openrail"}
2023-09-22T07:08:59+00:00
[]
[]
TAGS #license-openrail #region-us
# Lora FMG-9 (LLaMA2) 09122023 test 1 * Dataset of FMG-9 dialogue from Girls' Frontline * Manually edited to turn into multi-turn dialogue
[ "# Lora FMG-9 (LLaMA2) 09122023 test 1\n* Dataset of FMG-9 dialogue from Girls' Frontline\n* Manually edited to turn into multi-turn dialogue" ]
[ "TAGS\n#license-openrail #region-us \n", "# Lora FMG-9 (LLaMA2) 09122023 test 1\n* Dataset of FMG-9 dialogue from Girls' Frontline\n* Manually edited to turn into multi-turn dialogue" ]
[ 12, 43 ]
[ "passage: TAGS\n#license-openrail #region-us \n# Lora FMG-9 (LLaMA2) 09122023 test 1\n* Dataset of FMG-9 dialogue from Girls' Frontline\n* Manually edited to turn into multi-turn dialogue" ]
a8a8c8afdb0b1f51df55019edf90a4f7c63538e1
# Dataset of jougasaki_mika/城ヶ崎美嘉/죠가사키미카 (THE iDOLM@STER: Cinderella Girls) This is the dataset of jougasaki_mika/城ヶ崎美嘉/죠가사키미카 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are `pink_hair, yellow_eyes, breasts, bangs, long_hair, bow, medium_breasts, hair_bow, hair_between_eyes, ponytail, earrings`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 793.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jougasaki_mika_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 417.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jougasaki_mika_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1307 | 941.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jougasaki_mika_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 682.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jougasaki_mika_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1307 | 1.36 GiB | [Download](https://huggingface.co/datasets/CyberHarem/jougasaki_mika_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/jougasaki_mika_idolmastercinderellagirls', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
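If you want to mine those tags programmatically, the IMG+TXT packages listed above pair every image with a same-named `.txt` tag file. The sketch below is a rough illustration only: it assumes you have downloaded and extracted `dataset-800.zip` into a local `dataset_800/` directory and that each tag file holds comma-separated tags, which is the usual layout for these packages but is not documented here.

```python
import os
from collections import Counter

# Assumed extraction directory for dataset-800.zip (IMG+TXT variant).
dataset_dir = "dataset_800"

tag_counter = Counter()
for name in os.listdir(dataset_dir):
    if name.endswith(".txt"):
        with open(os.path.join(dataset_dir, name), encoding="utf-8") as f:
            tags = [t.strip() for t in f.read().split(",") if t.strip()]
        tag_counter.update(tags)

# The most frequent remaining tags are good outfit/feature candidates,
# since the core character tags were already pruned from this dataset.
for tag, count in tag_counter.most_common(20):
    print(f"{tag}: {count}")
```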
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, black_gloves, looking_at_viewer, purple_dress, solo, bare_shoulders, strapless_dress, blush, cleavage, bracelet, necklace, pantyhose, side_ponytail, smile, tattoo | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, blush, looking_at_viewer, smile, solo, holding_bouquet, white_dress, cleavage, petals, rose, hair_flower, pearl_necklace, strapless_dress, wedding_dress, yellow_flower | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, looking_at_viewer, solo, necklace, bare_shoulders, blush | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, blush, short_hair, solo, looking_at_viewer, smile, twintails, sitting | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, blush, cleavage, looking_at_viewer, solo, collarbone, off_shoulder, smile, nightgown, sitting, strap_slip | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, blush, looking_at_viewer, solo, cleavage, collarbone, simple_background, white_background, black_bra, jewelry, smile, upper_body, bare_shoulders, large_breasts, off_shoulder, parted_lips, shirt | | 6 | 15 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, bikini, cleavage, solo, hair_flower, looking_at_viewer, navel, jewelry, smile, twintails, blush, bare_shoulders, collarbone, pink_jacket, off_shoulder, short_shorts, fingerless_gloves, floral_print, open_jacket, open_mouth, water, white_gloves | | 7 | 18 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, blush, looking_at_viewer, simple_background, solo, white_background, cleavage, collarbone, elbow_gloves, smile, thighs, black_gloves, black_bikini, black_thighhighs, micro_bikini, navel, side-tie_bikini_bottom, bare_shoulders, choker, 
large_breasts, open_mouth, sitting, twintails | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, blush, looking_at_viewer, navel, outdoors, solo, blue_sky, cleavage, cloud, collarbone, cowboy_shot, day, halterneck, ocean, string_bikini, bare_shoulders, bracelet, beach, large_breasts, open_mouth, pink_bikini, wet, :d, grin, multi-strapped_bikini, nail_polish, necklace, red_bikini, side-tie_bikini_bottom, stomach, water | | 9 | 8 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, looking_at_viewer, solo, blush, grin, one_eye_closed, heart, jewelry, bespectacled, cleavage | | 10 | 5 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, bare_shoulders, black_footwear, cleavage, looking_at_viewer, solo, thigh_boots, bracelet, grin, midriff, navel, simple_background, white_background, belt, black_thighhighs, blush, collarbone, eyelashes, high_heel_boots, peace_symbol, twintails, black_bow, black_shorts, buckle, choker, crop_top, cross-laced_legwear, full_body, heart_earrings, lace-up_boots, large_breasts, nail_polish, shiny, shoe_soles, short_shorts, sidelocks, sitting, sleeveless, star_(symbol), stomach, underboob | | 11 | 13 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 1girl, school_uniform, solo, sweater_around_waist, looking_at_viewer, ring, shirt, grin, plaid_skirt, blush, button_badge | | 12 | 9 | ![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | ![](samples/12/clu12-sample2.png) | ![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | 1girl, solo, bowtie, looking_at_viewer, school_uniform, white_shirt, button_badge, collarbone, collared_shirt, ring, smiley_face, sweater_around_waist, white_background, blush, grin, striped_bow, hand_on_hip, nail_polish, purple_bow, simple_background, plaid_skirt, pleated_skirt, sidelocks | | 13 | 5 | ![](samples/13/clu13-sample0.png) | ![](samples/13/clu13-sample1.png) | ![](samples/13/clu13-sample2.png) | ![](samples/13/clu13-sample3.png) | ![](samples/13/clu13-sample4.png) | 1girl, black_gloves, blush, china_dress, red_dress, simple_background, white_background, bare_shoulders, black_panties, large_breasts, side-tie_panties, sleeveless_dress, solo, black_bra, closed_mouth, hair_down, lace_gloves, looking_at_viewer, alternate_hairstyle, bare_legs, black_footwear, cleavage_cutout, covered_navel, cowboy_shot, jewelry, pelvic_curtain, straight_hair, sweatdrop, thighs | | 14 | 21 | ![](samples/14/clu14-sample0.png) | ![](samples/14/clu14-sample1.png) | ![](samples/14/clu14-sample2.png) | ![](samples/14/clu14-sample3.png) | ![](samples/14/clu14-sample4.png) | blush, nipples, 1girl, hetero, 1boy, large_breasts, open_mouth, solo_focus, sex, sweat, penis, vaginal, navel, spread_legs, completely_nude, mosaic_censoring, cum_in_pussy, collarbone, looking_at_viewer | | 15 | 8 | ![](samples/15/clu15-sample0.png) | ![](samples/15/clu15-sample1.png) | ![](samples/15/clu15-sample2.png) | ![](samples/15/clu15-sample3.png) | ![](samples/15/clu15-sample4.png) | 1girl, large_breasts, nurse_cap, blush, 
cleavage_cutout, heart_cutout, looking_at_viewer, solo, heart_earrings, puffy_short_sleeves, stethoscope, 1boy, frills, grin, hetero, mosaic_censoring, penis, sitting, white_dress, white_thighhighs, wrist_cuffs | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | looking_at_viewer | purple_dress | solo | bare_shoulders | strapless_dress | blush | cleavage | bracelet | necklace | pantyhose | side_ponytail | smile | tattoo | holding_bouquet | white_dress | petals | rose | hair_flower | pearl_necklace | wedding_dress | yellow_flower | short_hair | twintails | sitting | collarbone | off_shoulder | nightgown | strap_slip | simple_background | white_background | black_bra | jewelry | upper_body | large_breasts | parted_lips | shirt | bikini | navel | pink_jacket | short_shorts | fingerless_gloves | floral_print | open_jacket | open_mouth | water | white_gloves | elbow_gloves | thighs | black_bikini | black_thighhighs | micro_bikini | side-tie_bikini_bottom | choker | outdoors | blue_sky | cloud | cowboy_shot | day | halterneck | ocean | string_bikini | beach | pink_bikini | wet | :d | grin | multi-strapped_bikini | nail_polish | red_bikini | stomach | one_eye_closed | heart | bespectacled | black_footwear | thigh_boots | midriff | belt | eyelashes | high_heel_boots | peace_symbol | black_bow | black_shorts | buckle | crop_top | cross-laced_legwear | full_body | heart_earrings | lace-up_boots | shiny | shoe_soles | sidelocks | sleeveless | star_(symbol) | underboob | school_uniform | sweater_around_waist | ring | plaid_skirt | button_badge | bowtie | white_shirt | collared_shirt | smiley_face | striped_bow | hand_on_hip | purple_bow | pleated_skirt | china_dress | red_dress | black_panties | side-tie_panties | sleeveless_dress | closed_mouth | hair_down | lace_gloves | alternate_hairstyle | bare_legs | cleavage_cutout | covered_navel | pelvic_curtain | straight_hair | sweatdrop | nipples | hetero | 1boy | solo_focus | sex | sweat | penis | vaginal | spread_legs | completely_nude | mosaic_censoring | cum_in_pussy | nurse_cap | heart_cutout | puffy_short_sleeves | stethoscope | frills | white_thighhighs | wrist_cuffs | 
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:--------------------|:---------------|:-------|:-----------------|:------------------|:--------|:-----------|:-----------|:-----------|:------------|:----------------|:--------|:---------|:------------------|:--------------|:---------|:-------|:--------------|:-----------------|:----------------|:----------------|:-------------|:------------|:----------|:-------------|:---------------|:------------|:-------------|:--------------------|:-------------------|:------------|:----------|:-------------|:----------------|:--------------|:--------|:---------|:--------|:--------------|:---------------|:--------------------|:---------------|:--------------|:-------------|:--------|:---------------|:---------------|:---------|:---------------|:-------------------|:---------------|:-------------------------|:---------|:-----------|:-----------|:--------|:--------------|:------|:-------------|:--------|:----------------|:--------|:--------------|:------|:-----|:-------|:------------------------|:--------------|:-------------|:----------|:-----------------|:--------|:---------------|:-----------------|:--------------|:----------|:-------|:------------|:------------------|:---------------|:------------|:---------------|:---------|:-----------|:----------------------|:------------|:-----------------|:----------------|:--------|:-------------|:------------|:-------------|:----------------|:------------|:-----------------|:-----------------------|:-------|:--------------|:---------------|:---------|:--------------|:-----------------|:--------------|:--------------|:--------------|:-------------|:----------------|:--------------|:------------|:----------------|:-------------------|:-------------------|:---------------|:------------|:--------------|:----------------------|:------------|:------------------|:----------------|:-----------------|:----------------|:------------|:----------|:---------|:-------|:-------------|:------|:--------|:--------|:----------|:--------------|:------------------|:-------------------|:---------------|:------------|:---------------|:----------------------|:--------------|:---------|:-------------------|:--------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | | X | X | X | X | X | | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 6 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | 
![](samples/2/clu2-sample4.png) | X | | X | | X | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | | X | | | X | | | | | | X | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | | X | | | X | X | | | | | X | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | | X | X | | X | X | | | | | X | | | | | | | | | | | | | X | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 15 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | | X | X | | X | X | | | | | X | | | | | | X | | | | | X | | X | X | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 18 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | X | | X | X | | X | X | | | | | X | | | | | | | | | | | X | X | X | | | | X | X | | | | X | | | | X | | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | X | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | X | | | | | | X | X | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 8 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | X | | X | | | 
X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 10 | 5 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | X | | X | X | | X | X | X | | | | | | | | | | | | | | | X | X | X | | | | X | X | | | | X | | | | X | | X | | | | | | | | | | X | | | X | | | | | | | | | | | | | X | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 11 | 13 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 12 | 9 | ![](samples/12/clu12-sample0.png) | ![](samples/12/clu12-sample1.png) | ![](samples/12/clu12-sample2.png) | ![](samples/12/clu12-sample3.png) | ![](samples/12/clu12-sample4.png) | X | | X | | X | | | X | | | | | | | | | | | | | | | | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 13 | 5 | ![](samples/13/clu13-sample0.png) | ![](samples/13/clu13-sample1.png) | ![](samples/13/clu13-sample2.png) | ![](samples/13/clu13-sample3.png) | ![](samples/13/clu13-sample4.png) | X | X | X | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | X | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | 14 | 21 | ![](samples/14/clu14-sample0.png) | ![](samples/14/clu14-sample1.png) | ![](samples/14/clu14-sample2.png) | ![](samples/14/clu14-sample3.png) | ![](samples/14/clu14-sample4.png) | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | 15 | 8 | ![](samples/15/clu15-sample0.png) | ![](samples/15/clu15-sample1.png) | ![](samples/15/clu15-sample2.png) | ![](samples/15/clu15-sample3.png) | ![](samples/15/clu15-sample4.png) | X | | X | | X | | | X | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | | | | X | | | | X | | X | X | X | X | X | X | X |
CyberHarem/jougasaki_mika_idolmastercinderellagirls
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T05:16:18+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T11:35:16+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of jougasaki\_mika/城ヶ崎美嘉/죠가사키미카 (THE iDOLM@STER: Cinderella Girls) ========================================================================== This is the dataset of jougasaki\_mika/城ヶ崎美嘉/죠가사키미카 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are 'pink\_hair, yellow\_eyes, breasts, bangs, long\_hair, bow, medium\_breasts, hair\_bow, hair\_between\_eyes, ponytail, earrings', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
bec275e88f4cc2fba3b138e4f95c6d660478a357
# Dataset of cattleya/カトレア (Pokémon) This is the dataset of cattleya/カトレア (Pokémon), containing 367 images and their tags. The core tags of this character are `long_hair, blonde_hair, hat, very_long_hair, breasts, green_eyes, blue_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 367 | 314.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cattleya_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 367 | 206.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cattleya_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 690 | 368.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cattleya_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 367 | 287.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cattleya_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 690 | 487.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cattleya_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/cattleya_pokemon', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, nude, solo, blush, navel, nipples, small_breasts, pussy, smile | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1boy, 1girl, blush, hetero, sex, solo_focus, vaginal, censored, nipples, penis, spread_legs, nude, open_mouth, small_breasts, smile, cum_in_pussy, navel, on_back | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blush, hetero, nipples, nude, penis, solo_focus, 1boy, small_breasts, fellatio, medium_breasts, uncensored | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1boy, 1girl, handjob, hetero, penis, solo_focus, bar_censor, blush, nipples, nude, pointless_censoring, smile | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1boy, 1girl, blush, hetero, mosaic_censoring, nipples, paizuri, penis, solo_focus, cum_on_breasts, looking_at_viewer, on_back, open_mouth, heart, huge_breasts, pov, white_headwear, ;o, bare_shoulders, large_breasts, one_eye_closed, parted_bangs, shirt, speech_bubble | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, dress, solo | | 6 | 14 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, long_sleeves, parted_bangs, eyelashes, looking_at_viewer, closed_mouth, pink_footwear, shoes, pink_headwear, collarbone, pink_dress, pokemon_(creature), solo, full_body, white_headwear, sitting | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, dress, long_sleeves, looking_at_viewer, solo, eyelashes, parted_bangs, hand_up, open_mouth, pink_headwear, aqua_eyes | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, black_dress, hair_ornament, looking_at_viewer, official_alternate_costume, sidelocks, blush, detached_sleeves, parted_bangs, pokemon_(creature), ponytail, black_choker, closed_mouth, eyelashes, pantyhose, bare_shoulders, red_gemstone | | 9 | 6 | ![](samples/9/clu9-sample0.png) | 
![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, pokemon_(creature), lying, sleeping, brown_hair, closed_eyes, smile | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | nude | solo | blush | navel | nipples | small_breasts | pussy | smile | 1boy | hetero | sex | solo_focus | vaginal | censored | penis | spread_legs | open_mouth | cum_in_pussy | on_back | fellatio | medium_breasts | uncensored | handjob | bar_censor | pointless_censoring | mosaic_censoring | paizuri | cum_on_breasts | looking_at_viewer | heart | huge_breasts | pov | white_headwear | ;o | bare_shoulders | large_breasts | one_eye_closed | parted_bangs | shirt | speech_bubble | dress | long_sleeves | eyelashes | closed_mouth | pink_footwear | shoes | pink_headwear | collarbone | pink_dress | pokemon_(creature) | full_body | sitting | hand_up | aqua_eyes | black_dress | hair_ornament | official_alternate_costume | sidelocks | detached_sleeves | ponytail | black_choker | pantyhose | red_gemstone | lying | sleeping | brown_hair | closed_eyes | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------|:--------|:--------|:----------|:----------------|:--------|:--------|:-------|:---------|:------|:-------------|:----------|:-----------|:--------|:--------------|:-------------|:---------------|:----------|:-----------|:-----------------|:-------------|:----------|:-------------|:----------------------|:-------------------|:----------|:-----------------|:--------------------|:--------|:---------------|:------|:-----------------|:-----|:-----------------|:----------------|:-----------------|:---------------|:--------|:----------------|:--------|:---------------|:------------|:---------------|:----------------|:--------|:----------------|:-------------|:-------------|:---------------------|:------------|:----------|:----------|:------------|:--------------|:----------------|:-----------------------------|:------------|:-------------------|:-----------|:---------------|:------------|:---------------|:--------|:-----------|:-------------|:--------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | X | | X | X | | | X | X | | X | | | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | X | | X 
| | | X | X | X | | X | | | X | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | | X | | X | | | | X | X | | X | | | X | | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 14 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | X | | | X | X | X | | | | X | | | | | | X | X | | | | | | | | | | | | | | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | X | | | | | X | X | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | 9 | 6 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X |
CyberHarem/cattleya_pokemon
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T05:17:36+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T20:57:39+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of cattleya/カトレア (Pokémon) ================================== This is the dataset of cattleya/カトレア (Pokémon), containing 367 images and their tags. The core tags of this character are 'long\_hair, blonde\_hair, hat, very\_long\_hair, breasts, green\_eyes, blue\_eyes', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
6c68d7b8da9b3e590dd884ad1315f3366d4cf816
# Dataset Card for "c4_biomedicalacademic_2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zxvix/c4_biomedicalacademic_2
[ "region:us" ]
2023-09-12T05:20:56+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "timestamp", "dtype": "timestamp[s]"}, {"name": "url", "dtype": "string"}, {"name": "original_text", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 3027569.0, "num_examples": 989}], "download_size": 1767632, "dataset_size": 3027569.0}}
2023-09-12T05:21:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for "c4_biomedicalacademic_2" More Information needed
[ "# Dataset Card for \"c4_biomedicalacademic_2\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"c4_biomedicalacademic_2\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"c4_biomedicalacademic_2\"\n\nMore Information needed" ]
c15b008aedd844cd58ceeb1d5d13bfc77d8f4e22
# Dataset Card for Evaluation run of Sao10K/Stheno-1.2-L2-13B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Sao10K/Stheno-1.2-L2-13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Sao10K/Stheno-1.2-L2-13B](https://huggingface.co/Sao10K/Stheno-1.2-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-29T06:49:47.166294](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B/blob/main/results_2023-10-29T06-49-47.166294.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval): ```python { "all": { "em": 0.397126677852349, "em_stderr": 0.005010917075875424, "f1": 0.4671539429530222, "f1_stderr": 0.0047944933216487965, "acc": 0.42948814994019174, "acc_stderr": 0.01038154947148015 }, "harness|drop|3": { "em": 0.397126677852349, "em_stderr": 0.005010917075875424, "f1": 0.4671539429530222, "f1_stderr": 0.0047944933216487965 }, "harness|gsm8k|5": { "acc": 0.10917361637604246, "acc_stderr": 0.008590089300511146 }, "harness|winogrande|5": { "acc": 0.749802683504341, "acc_stderr": 0.012173009642449155 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
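Complementing the per-task snippet above, the aggregated metrics can in principle be pulled the same way. The configuration name `results` and the use of the `train` split are taken from the description in this card rather than verified against the repository layout, so treat this as a sketch that may need adjusting:

```python
from datasets import load_dataset

# Aggregated run results, as described in the Dataset Summary above.
results = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B",
    "results",
    split="train",
)
print(results[0])
```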
open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B
[ "region:us" ]
2023-09-12T05:46:53+00:00
{"pretty_name": "Evaluation run of Sao10K/Stheno-1.2-L2-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Sao10K/Stheno-1.2-L2-13B](https://huggingface.co/Sao10K/Stheno-1.2-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-29T06:49:47.166294](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B/blob/main/results_2023-10-29T06-49-47.166294.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.397126677852349,\n \"em_stderr\": 0.005010917075875424,\n \"f1\": 0.4671539429530222,\n \"f1_stderr\": 0.0047944933216487965,\n \"acc\": 0.42948814994019174,\n \"acc_stderr\": 0.01038154947148015\n },\n \"harness|drop|3\": {\n \"em\": 0.397126677852349,\n \"em_stderr\": 0.005010917075875424,\n \"f1\": 0.4671539429530222,\n \"f1_stderr\": 0.0047944933216487965\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10917361637604246,\n \"acc_stderr\": 0.008590089300511146\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.749802683504341,\n \"acc_stderr\": 0.012173009642449155\n }\n}\n```", "repo_url": "https://huggingface.co/Sao10K/Stheno-1.2-L2-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|arc:challenge|25_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_29T06_49_47.166294", "path": ["**/details_harness|drop|3_2023-10-29T06-49-47.166294.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-29T06-49-47.166294.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_29T06_49_47.166294", "path": ["**/details_harness|gsm8k|5_2023-10-29T06-49-47.166294.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-29T06-49-47.166294.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hellaswag|10_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T06-46-37.023580.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T06-46-37.023580.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T06-46-37.023580.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T06-46-37.023580.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T06-46-37.023580.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_29T06_49_47.166294", "path": ["**/details_harness|winogrande|5_2023-10-29T06-49-47.166294.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-29T06-49-47.166294.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_12T06_46_37.023580", "path": ["results_2023-09-12T06-46-37.023580.parquet"]}, {"split": "2023_10_29T06_49_47.166294", "path": ["results_2023-10-29T06-49-47.166294.parquet"]}, {"split": "latest", "path": ["results_2023-10-29T06-49-47.166294.parquet"]}]}]}
2023-10-29T06:49:59+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Sao10K/Stheno-1.2-L2-13B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Sao10K/Stheno-1.2-L2-13B on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the sketch below): ## Latest results These are the latest results from run 2023-10-29T06:49:47.166294 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
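A minimal sketch of the load call referenced in the card above, reconstructed from the snippet embedded in this record's metadata; the config name is one of the 64 listed configurations and can be swapped for any other.

```python
from datasets import load_dataset

# repo id and config name are taken from this record's metadata;
# any other listed config (e.g. "harness_arc_challenge_25") works the same way
data = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Stheno-1.2-L2-13B",
    "harness_winogrande_5",
    split="train",
)
```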
[ "# Dataset Card for Evaluation run of Sao10K/Stheno-1.2-L2-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Stheno-1.2-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-29T06:49:47.166294(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Sao10K/Stheno-1.2-L2-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Stheno-1.2-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-29T06:49:47.166294(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Sao10K/Stheno-1.2-L2-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Stheno-1.2-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-29T06:49:47.166294(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
5166ba86516083f8189e5d0d8c36ab6a758eca00
# Dataset Card for "autotree_automl_100000_Diabetes130US_sgosdt_l256_dim7_d3_sd0" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
yzhuang/autotree_automl_100000_Diabetes130US_sgosdt_l256_dim7_d3_sd0
[ "region:us" ]
2023-09-12T05:53:51+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "input_x", "sequence": {"sequence": "float32"}}, {"name": "input_y", "sequence": {"sequence": "float32"}}, {"name": "input_y_clean", "sequence": {"sequence": "float32"}}, {"name": "rtg", "sequence": "float64"}, {"name": "status", "sequence": {"sequence": "float32"}}, {"name": "split_threshold", "sequence": {"sequence": "float32"}}, {"name": "split_dimension", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 2057200000, "num_examples": 100000}, {"name": "validation", "num_bytes": 205720000, "num_examples": 10000}], "download_size": 257403365, "dataset_size": 2262920000}}
2023-09-12T05:54:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for "autotree_automl_100000_Diabetes130US_sgosdt_l256_dim7_d3_sd0" More Information needed
[ "# Dataset Card for \"autotree_automl_100000_Diabetes130US_sgosdt_l256_dim7_d3_sd0\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"autotree_automl_100000_Diabetes130US_sgosdt_l256_dim7_d3_sd0\"\n\nMore Information needed" ]
[ 6, 40 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"autotree_automl_100000_Diabetes130US_sgosdt_l256_dim7_d3_sd0\"\n\nMore Information needed" ]
da70e68bf983ee04c625629873e3b8085364c894
1. Training Examples: 9000 ids in total
2. Testing Examples: 394 ids in total
Tyrael/key_info_simple
[ "license:other", "region:us" ]
2023-09-12T05:59:54+00:00
{"license": "other"}
2023-09-12T06:01:49+00:00
[]
[]
TAGS #license-other #region-us
1. Training Examples: 9000 ids in total 2. Testing Examples: 394 ids in total
[]
[ "TAGS\n#license-other #region-us \n" ]
[ 11 ]
[ "passage: TAGS\n#license-other #region-us \n" ]
5f15d69e4aafe7fe1a55aeff8814d9d2521dcf5a
# Dataset Card for "marble_prompts" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Falah/marble_prompts
[ "region:us" ]
2023-09-12T06:01:43+00:00
{"dataset_info": {"features": [{"name": "prompts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3029313, "num_examples": 10000}], "download_size": 380891, "dataset_size": 3029313}}
2023-09-12T06:45:44+00:00
[]
[]
TAGS #region-us
# Dataset Card for "marble_prompts" More Information needed
[ "# Dataset Card for \"marble_prompts\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"marble_prompts\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"marble_prompts\"\n\nMore Information needed" ]
befc7d408701bc8e0dd2cfe65af18b0b1f39b95b
converted from `https://huggingface.co/datasets/mcipriano/stackoverflow-kubernetes-questions/blob/main/README.md`, with the format changed from parquet to csv

converting code as below
```
import pandas as pd

# read the parquet dump and write it back out as csv
data = pd.read_parquet("~/Downloads/kubernetes_dump.parquet")
#print(data.count())
#data.head()

data.to_csv('/tmp/out.csv', index=False)
```
peterpanpan/stackoverflow-kubernetes-questions
[ "license:apache-2.0", "region:us" ]
2023-09-12T06:26:26+00:00
{"license": "apache-2.0"}
2023-09-12T06:28:38+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
converted from 'URL', with the format changed from parquet to csv; converting code as below
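The conversion snippet referred to above, restated as a short sketch from the full card text earlier in this record; the input and output paths are the author's examples, not fixed locations.

```python
import pandas as pd

# read the parquet dump and re-export it as csv (paths as given in the card)
data = pd.read_parquet("~/Downloads/kubernetes_dump.parquet")
data.to_csv('/tmp/out.csv', index=False)
```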
[]
[ "TAGS\n#license-apache-2.0 #region-us \n" ]
[ 14 ]
[ "passage: TAGS\n#license-apache-2.0 #region-us \n" ]
67f5aa696d6b27c9352f80a7d38b7a8befbfa3b2
# SmallSentiment-1 Created from AIOD platform
mtkinit/SmallSentiment-1
[ "region:us" ]
2023-09-12T06:35:20+00:00
{"pretty_name": "SmallSentiment-1"}
2023-09-12T06:35:22+00:00
[]
[]
TAGS #region-us
# SmallSentiment-1 Created from AIOD platform
[ "# SmallSentiment-1\nCreated from AIOD platform" ]
[ "TAGS\n#region-us \n", "# SmallSentiment-1\nCreated from AIOD platform" ]
[ 6, 12 ]
[ "passage: TAGS\n#region-us \n# SmallSentiment-1\nCreated from AIOD platform" ]
5ee547f6ac05f15787734608fb93739438226a4a
# Dataset of cogito/コギト (Pokémon) This is the dataset of cogito/コギト (Pokémon), containing 237 images and their tags. The core tags of this character are `hair_over_one_eye, hat, breasts, black_headwear, grey_hair, short_hair, mature_female, grey_eyes, white_hair, large_breasts, huge_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 237 | 299.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cogito_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 237 | 159.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cogito_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 551 | 322.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cogito_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 237 | 260.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cogito_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 551 | 480.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cogito_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/cogito_pokemon', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, collarbone, looking_at_viewer, navel, nipples, nude, solo, pussy, smile, closed_mouth, curvy, thighs, cleft_of_venus, uncensored | | 1 | 18 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_dress, solo, white_gloves, collarbone, looking_at_viewer, smile, hand_up, closed_mouth, upper_body, black_choker, cleavage, eyelashes | | 2 | 11 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1boy, 1girl, hetero, blush, penis, pov, collarbone, nipples, looking_at_viewer, navel, smile, sweat, pussy, sex, vaginal, completely_nude, mosaic_censoring, paizuri, solo_focus, white_gloves, closed_mouth, cowgirl_position, cum, flower, girl_on_top, green_eyes, medium_hair, open_mouth, spread_legs, uncensored | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | collarbone | looking_at_viewer | navel | nipples | nude | solo | pussy | smile | closed_mouth | curvy | thighs | cleft_of_venus | uncensored | black_dress | white_gloves | hand_up | upper_body | black_choker | cleavage | eyelashes | 1boy | hetero | blush | penis | pov | sweat | sex | vaginal | completely_nude | mosaic_censoring | paizuri | solo_focus | cowgirl_position | cum | flower | girl_on_top | green_eyes | medium_hair | open_mouth | spread_legs | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------------------|:--------|:----------|:-------|:-------|:--------|:--------|:---------------|:--------|:---------|:-----------------|:-------------|:--------------|:---------------|:----------|:-------------|:---------------|:-----------|:------------|:-------|:---------|:--------|:--------|:------|:--------|:------|:----------|:------------------|:-------------------|:----------|:-------------|:-------------------|:------|:---------|:--------------|:-------------|:--------------|:-------------|:--------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 18 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | | | X | | X | X | | | | | X | X | X | X | X | X | X | | | | | | 
| | | | | | | | | | | | | | | | 2 | 11 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | | | X | X | X | | | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/cogito_pokemon
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T06:37:45+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T20:41:50+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of cogito/コギト (Pokémon) =============================== This is the dataset of cogito/コギト (Pokémon), containing 237 images and their tags. The core tags of this character are 'hair\_over\_one\_eye, hat, breasts, black\_headwear, grey\_hair, short\_hair, mature\_female, grey\_eyes, white\_hair, large\_breasts, huge\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
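The waifuc loading code referenced in the card above, reproduced as a sketch from the full card text earlier in this record; the local directory name is just an example.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive of this dataset from the Hugging Face Hub
zip_file = hf_hub_download(
    repo_id='CyberHarem/cogito_pokemon',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the archive into a local directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# iterate over the tagged images with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```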
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
4a101675c77357299abacf72f44820c678b4ebb7
# Dataset Card for Evaluation run of microsoft/phi-1_5 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/microsoft/phi-1_5 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [microsoft/phi-1_5](https://huggingface.co/microsoft/phi-1_5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 3 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_microsoft__phi-1_5_public", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-11-06T13:16:44.433412](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__phi-1_5_public/blob/main/results_2023-11-06T13-16-44.433412.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0025167785234899327, "em_stderr": 0.0005131152834514706, "f1": 0.050449874161074076, "f1_stderr": 0.0013066024783407934, "acc": 0.4232574963812843, "acc_stderr": 0.01083889957295003 }, "harness|drop|3": { "em": 0.0025167785234899327, "em_stderr": 0.0005131152834514706, "f1": 0.050449874161074076, "f1_stderr": 0.0013066024783407934 }, "harness|gsm8k|5": { "acc": 0.1243366186504928, "acc_stderr": 0.009088880962028463 }, "harness|winogrande|5": { "acc": 0.7221783741120757, "acc_stderr": 0.0125889181838716 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
open-llm-leaderboard/details_microsoft__phi-1_5
[ "region:us" ]
2023-09-12T07:08:37+00:00
{"pretty_name": "Evaluation run of microsoft/phi-1_5", "dataset_summary": "Dataset automatically created during the evaluation run of model [microsoft/phi-1_5](https://huggingface.co/microsoft/phi-1_5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_microsoft__phi-1_5_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-06T13:16:44.433412](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__phi-1_5_public/blob/main/results_2023-11-06T13-16-44.433412.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0025167785234899327,\n \"em_stderr\": 0.0005131152834514706,\n \"f1\": 0.050449874161074076,\n \"f1_stderr\": 0.0013066024783407934,\n \"acc\": 0.4232574963812843,\n \"acc_stderr\": 0.01083889957295003\n },\n \"harness|drop|3\": {\n \"em\": 0.0025167785234899327,\n \"em_stderr\": 0.0005131152834514706,\n \"f1\": 0.050449874161074076,\n \"f1_stderr\": 0.0013066024783407934\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1243366186504928,\n \"acc_stderr\": 0.009088880962028463\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.0125889181838716\n }\n}\n```", "repo_url": "https://huggingface.co/microsoft/phi-1_5", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_06T13_16_44.433412", "path": ["**/details_harness|drop|3_2023-11-06T13-16-44.433412.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-06T13-16-44.433412.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_06T13_16_44.433412", "path": ["**/details_harness|gsm8k|5_2023-11-06T13-16-44.433412.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-06T13-16-44.433412.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_06T13_16_44.433412", "path": ["**/details_harness|winogrande|5_2023-11-06T13-16-44.433412.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-06T13-16-44.433412.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_06T13_16_44.433412", "path": ["results_2023-11-06T13-16-44.433412.parquet"]}, {"split": "latest", "path": ["results_2023-11-06T13-16-44.433412.parquet"]}]}]}
2023-12-01T14:38:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of microsoft/phi-1_5 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model microsoft/phi-1_5 on the Open LLM Leaderboard. The dataset is composed of 3 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-11-06T13:16:44.433412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of microsoft/phi-1_5", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model microsoft/phi-1_5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-11-06T13:16:44.433412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of microsoft/phi-1_5", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model microsoft/phi-1_5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-11-06T13:16:44.433412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 16, 31, 165, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of microsoft/phi-1_5## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model microsoft/phi-1_5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-06T13:16:44.433412(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
2bf151c15297ab8f78d7002b10e10222714b3689
This dataset is a subset of the Open Assistant dataset, which you can find here: https://huggingface.co/datasets/OpenAssistant/oasst1/tree/main This subset of the data only contains the highest-rated paths in the conversation tree, with a total of 9,846 samples. This dataset was used to train Guanaco with QLoRA. For further information, please see the original dataset. License: Apache 2.0
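A minimal quick-look sketch, assuming only that the hosting repo (pavol58/test, the one this card sits in) loads with `load_dataset`; splits and columns are printed rather than assumed, since the card does not document the schema.

```python
from datasets import load_dataset

# Load the hosted subset and inspect whatever splits and columns it exposes.
ds = load_dataset("pavol58/test")
print(ds)  # available splits and their sizes
for split_name, split in ds.items():
    print(split_name, split.column_names, split.num_rows)
```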
pavol58/test
[ "region:us" ]
2023-09-12T07:15:38+00:00
{}
2023-09-12T07:18:04+00:00
[]
[]
TAGS #region-us
This dataset is a subset of the Open Assistant dataset, which you can find here: URL This subset of the data only contains the highest-rated paths in the conversation tree, with a total of 9,846 samples. This dataset was used to train Guanaco with QLoRA. For further information, please see the original dataset. License: Apache 2.0
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
eb6d528db7a2e1c866e59f0c3d3262ce529aaaaf
# Dataset of maekawa_miku/前川みく/마에카와미쿠 (THE iDOLM@STER: Cinderella Girls) This is the dataset of maekawa_miku/前川みく/마에카와미쿠 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are `brown_hair, green_eyes, short_hair, breasts, animal_ears, cat_ears, fang, bangs, medium_breasts, tail, cat_tail`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 685.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maekawa_miku_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 377.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maekawa_miku_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1274 | 839.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maekawa_miku_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 600.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maekawa_miku_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1274 | 1.19 GiB | [Download](https://huggingface.co/datasets/CyberHarem/maekawa_miku_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/maekawa_miku_idolmastercinderellagirls', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blush, looking_at_viewer, school_uniform, solo, white_shirt, collared_shirt, pleated_skirt, red_bowtie, short_sleeves, sweater_vest, white_background, paw_pose, simple_background, :d, black_thighhighs, grey_skirt, open_mouth | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, open_mouth, solo, looking_at_viewer, smile, cleavage, paw_pose | | 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, open_mouth, ponytail, solo, paw_pose, smile, thighhighs, character_name, ribbon | | 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, looking_at_viewer, solo, blush, collarbone, simple_background, white_background, dress, hair_flower, paw_pose, long_sleeves, open_mouth, red_flower, :d, dated, one_eye_closed, rose | | 4 | 17 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | looking_at_viewer, 1girl, blush, enmaided, maid_headdress, solo, wrist_cuffs, waist_apron, cleavage, open_mouth, white_apron, black_dress, black_ribbon, simple_background, maid_apron, puffy_short_sleeves, white_background, :d, detached_collar, white_thighhighs, neck_ribbon, frilled_apron, large_breasts, heart, paw_pose | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, blush, collarbone, looking_at_viewer, open_mouth, solo, white_background, large_breasts, simple_background, cleavage, micro_bikini, navel, smile, thighs, bare_shoulders, elbow_gloves, black_bikini, fake_animal_ears, paw_gloves, side-tie_bikini_bottom, squatting, white_bikini, white_choker, white_gloves, white_thighhighs | | 6 | 8 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, hair_bow, cleavage, earrings, looking_at_viewer, pink_bow, pink_dress, solo, bare_shoulders, blush, white_background, gloves, :d, detached_sleeves, open_mouth, ribbon, simple_background, :3, choker, frilled_dress, heart_tail, short_sleeves | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | 
![](samples/7/clu7-sample4.png) | 1girl, bare_shoulders, blush, cleavage, detached_collar, detached_sleeves, floral_print, hair_ornament, looking_at_viewer, ponytail, solo, bow, frilled_sleeves, red_ribbon, strapless_dress, wide_sleeves, frilled_dress, hair_ribbon, pink_dress, simple_background, smile, tail_ornament, white_background, frilled_collar, jingle_bell, large_breasts, open_mouth, paw_pose, sitting | | 8 | 17 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, solo, cleavage, blush, looking_at_viewer, collarbone, large_breasts, navel, smile, bare_shoulders, open_mouth, red_bikini, day, outdoors, hair_flower, plaid_bikini, blue_sky, bikini_skirt, cloud, necklace, ocean, water, beach, frills | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, blush, cleavage, looking_at_viewer, simple_background, solo, bare_shoulders, navel, open_mouth, paw_pose, ponytail, white_background, collarbone, underwear_only, :3, :d, pink_bra, side-tie_panties | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | school_uniform | solo | white_shirt | collared_shirt | pleated_skirt | red_bowtie | short_sleeves | sweater_vest | white_background | paw_pose | simple_background | :d | black_thighhighs | grey_skirt | open_mouth | smile | cleavage | ponytail | thighhighs | character_name | ribbon | collarbone | dress | hair_flower | long_sleeves | red_flower | dated | one_eye_closed | rose | enmaided | maid_headdress | wrist_cuffs | waist_apron | white_apron | black_dress | black_ribbon | maid_apron | puffy_short_sleeves | detached_collar | white_thighhighs | neck_ribbon | frilled_apron | large_breasts | heart | micro_bikini | navel | thighs | bare_shoulders | elbow_gloves | black_bikini | fake_animal_ears | paw_gloves | side-tie_bikini_bottom | squatting | white_bikini | white_choker | white_gloves | hair_bow | earrings | pink_bow | pink_dress | gloves | detached_sleeves | :3 | choker | frilled_dress | heart_tail | floral_print | hair_ornament | bow | frilled_sleeves | red_ribbon | strapless_dress | wide_sleeves | hair_ribbon | tail_ornament | frilled_collar | jingle_bell | sitting | red_bikini | day | outdoors | plaid_bikini | blue_sky | bikini_skirt | cloud | necklace | ocean | water | beach | frills | underwear_only | pink_bra | side-tie_panties | 
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-----------------|:-------|:--------------|:-----------------|:----------------|:-------------|:----------------|:---------------|:-------------------|:-----------|:--------------------|:-----|:-------------------|:-------------|:-------------|:--------|:-----------|:-----------|:-------------|:-----------------|:---------|:-------------|:--------|:--------------|:---------------|:-------------|:--------|:-----------------|:-------|:-----------|:-----------------|:--------------|:--------------|:--------------|:--------------|:---------------|:-------------|:----------------------|:------------------|:-------------------|:--------------|:----------------|:----------------|:--------|:---------------|:--------|:---------|:-----------------|:---------------|:---------------|:-------------------|:-------------|:-------------------------|:------------|:---------------|:---------------|:---------------|:-----------|:-----------|:-----------|:-------------|:---------|:-------------------|:-----|:---------|:----------------|:-------------|:---------------|:----------------|:------|:------------------|:-------------|:------------------|:---------------|:--------------|:----------------|:-----------------|:--------------|:----------|:-------------|:------|:-----------|:---------------|:-----------|:---------------|:--------|:-----------|:--------|:--------|:--------|:---------|:-----------------|:-----------|:-------------------| | 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | | X | | | | | | | | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 9 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | | X | | | | | | | | X | | | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | | X | | | | | | | X | X | X | X | | | X | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 17 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | | X | | | | | | | X | X | X | X | | | X | | X | | | | | | | | | | | | | X | X | X | X | 
X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 8 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | | X | | | | | | | X | | X | | | | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 8 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | | X | | | | | X | | X | | X | X | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | X | | X | | | | | | | X | X | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | X | | | | | | | | | | | | | X | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | 8 | 17 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | X | | X | | | | | | | | | | | | | X | X | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | X | | X | | | | | | | X | X | X | X | | | X | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X |
CyberHarem/maekawa_miku_idolmastercinderellagirls
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T07:16:51+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T10:03:07+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of maekawa\_miku/前川みく/마에카와미쿠 (THE iDOLM@STER: Cinderella Girls) ======================================================================= This is the dataset of maekawa\_miku/前川みく/마에카와미쿠 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are 'brown\_hair, green\_eyes, short\_hair, breasts, animal\_ears, cat\_ears, fang, bangs, medium\_breasts, tail, cat\_tail', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
a1ff0bbee9ed12b787de07aa2e8b17b7ab37e084
# Dataset Card for "squad_instruction_v1_train_100" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_instruction_v1_train_100
[ "region:us" ]
2023-09-12T07:20:05+00:00
{"dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int64"}, {"name": "text", "sequence": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 177041.73335312048, "num_examples": 100}, {"name": "validation", "num_bytes": 1888548.7582781457, "num_examples": 1000}], "download_size": 1184787, "dataset_size": 2065590.4916312662}}
2023-09-12T16:04:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for "squad_instruction_v1_train_100" More Information needed
[ "# Dataset Card for \"squad_instruction_v1_train_100\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"squad_instruction_v1_train_100\"\n\nMore Information needed" ]
[ 6, 22 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"squad_instruction_v1_train_100\"\n\nMore Information needed" ]
40c8b009c98469620fba93c9e4e797b3d7a5e1cc
# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096](https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096_public", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-11-07T01:08:34.943024](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096_public/blob/main/results_2023-11-07T01-08-34.943024.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval): ```python { "all": { "em": 0.0041946308724832215, "em_stderr": 0.0006618716168266773, "f1": 0.05325922818791949, "f1_stderr": 0.0013955852580394643, "acc": 0.26400947119179163, "acc_stderr": 0.007015202106702891 }, "harness|drop|3": { "em": 0.0041946308724832215, "em_stderr": 0.0006618716168266773, "f1": 0.05325922818791949, "f1_stderr": 0.0013955852580394643 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }, "harness|winogrande|5": { "acc": 0.5280189423835833, "acc_stderr": 0.014030404213405782 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators?
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
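Because this card covers two runs, here is a minimal sketch of comparing one task across runs; it assumes the split names listed in this record's configuration are still valid and that "latest" aliases the newer timestamp.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096_public"

# Split names are copied from this repo's configuration; "latest" tracks the
# newer of the two recorded runs.
first_run = load_dataset(REPO, "harness_gsm8k_5", split="2023_11_05T07_56_49.071994")
latest_run = load_dataset(REPO, "harness_gsm8k_5", split="latest")
print(len(first_run), len(latest_run))
```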
open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096
[ "region:us" ]
2023-09-12T07:25:19+00:00
{"pretty_name": "Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096](https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-07T01:08:34.943024](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096_public/blob/main/results_2023-11-07T01-08-34.943024.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0041946308724832215,\n \"em_stderr\": 0.0006618716168266773,\n \"f1\": 0.05325922818791949,\n \"f1_stderr\": 0.0013955852580394643,\n \"acc\": 0.26400947119179163,\n \"acc_stderr\": 0.007015202106702891\n },\n \"harness|drop|3\": {\n \"em\": 0.0041946308724832215,\n \"em_stderr\": 0.0006618716168266773,\n \"f1\": 0.05325922818791949,\n \"f1_stderr\": 0.0013955852580394643\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5280189423835833,\n \"acc_stderr\": 0.014030404213405782\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_05T07_56_49.071994", "path": ["**/details_harness|drop|3_2023-11-05T07-56-49.071994.parquet"]}, {"split": "2023_11_07T01_08_34.943024", "path": ["**/details_harness|drop|3_2023-11-07T01-08-34.943024.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-07T01-08-34.943024.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_05T07_56_49.071994", "path": ["**/details_harness|gsm8k|5_2023-11-05T07-56-49.071994.parquet"]}, {"split": "2023_11_07T01_08_34.943024", "path": ["**/details_harness|gsm8k|5_2023-11-07T01-08-34.943024.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-07T01-08-34.943024.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_05T07_56_49.071994", "path": 
["**/details_harness|winogrande|5_2023-11-05T07-56-49.071994.parquet"]}, {"split": "2023_11_07T01_08_34.943024", "path": ["**/details_harness|winogrande|5_2023-11-07T01-08-34.943024.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-07T01-08-34.943024.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_05T07_56_49.071994", "path": ["results_2023-11-05T07-56-49.071994.parquet"]}, {"split": "2023_11_07T01_08_34.943024", "path": ["results_2023-11-07T01-08-34.943024.parquet"]}, {"split": "latest", "path": ["results_2023-11-07T01-08-34.943024.parquet"]}]}]}
2023-12-01T14:27:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096 on the Open LLM Leaderboard. The dataset is composed of 3 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-11-07T01:08:34.943024(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096 on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-11-07T01:08:34.943024(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096 on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-11-07T01:08:34.943024(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 40, 31, 189, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/RWKV-4-PilePlus-1B5-20230520-2942-486Gtokens-ctx4096 on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-07T01:08:34.943024(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c6175027a236c257a55c35f76f503a5efa764e98
# Dataset Card for "processed_bert_dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tmon546596046/processed_bert_dataset
[ "region:us" ]
2023-09-12T07:32:40+00:00
{"dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}, {"name": "token_type_ids", "sequence": "int8"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "special_tokens_mask", "sequence": "int8"}], "splits": [{"name": "train", "num_bytes": 67615200.0, "num_examples": 18782}], "download_size": 16390157, "dataset_size": 67615200.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-12T07:32:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for "processed_bert_dataset" More Information needed
[ "# Dataset Card for \"processed_bert_dataset\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"processed_bert_dataset\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"processed_bert_dataset\"\n\nMore Information needed" ]
bb7f9c1e4bddf084fe222522dd94cb4912301b6d
# Dataset Card for Evaluation run of Sao10K/Stheno-Inverted-1.2-L2-13B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Sao10K/Stheno-Inverted-1.2-L2-13B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Sao10K/Stheno-Inverted-1.2-L2-13B](https://huggingface.co/Sao10K/Stheno-Inverted-1.2-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-Inverted-1.2-L2-13B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-23T18:34:04.401753](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-Inverted-1.2-L2-13B/blob/main/results_2023-10-23T18-34-04.401753.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval): ```python { "all": { "em": 0.006082214765100671, "em_stderr": 0.000796243239302896, "f1": 0.08614303691275203, "f1_stderr": 0.001761867125581607, "acc": 0.4180536664965267, "acc_stderr": 0.010042668742086668 }, "harness|drop|3": { "em": 0.006082214765100671, "em_stderr": 0.000796243239302896, "f1": 0.08614303691275203, "f1_stderr": 0.001761867125581607 }, "harness|gsm8k|5": { "acc": 0.08946171341925702, "acc_stderr": 0.007861583049939721 }, "harness|winogrande|5": { "acc": 0.7466456195737964, "acc_stderr": 0.012223754434233614 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
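Unlike the smaller eval records above, this repo declares one configuration per MMLU subtask. A minimal sketch, assuming the config and split names in this record's configuration are still valid, of pulling a single subtask's per-example details:

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_Sao10K__Stheno-Inverted-1.2-L2-13B"

# Config and split names come from this repo's configuration; the exact
# per-example columns logged by the harness are printed, not asserted.
details = load_dataset(REPO, "harness_hendrycksTest_abstract_algebra_5", split="latest")
print(details)
print(sorted(details.column_names))
```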
open-llm-leaderboard/details_Sao10K__Stheno-Inverted-1.2-L2-13B
[ "region:us" ]
2023-09-12T07:39:26+00:00
{"pretty_name": "Evaluation run of Sao10K/Stheno-Inverted-1.2-L2-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Sao10K/Stheno-Inverted-1.2-L2-13B](https://huggingface.co/Sao10K/Stheno-Inverted-1.2-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Stheno-Inverted-1.2-L2-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T18:34:04.401753](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-Inverted-1.2-L2-13B/blob/main/results_2023-10-23T18-34-04.401753.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006082214765100671,\n \"em_stderr\": 0.000796243239302896,\n \"f1\": 0.08614303691275203,\n \"f1_stderr\": 0.001761867125581607,\n \"acc\": 0.4180536664965267,\n \"acc_stderr\": 0.010042668742086668\n },\n \"harness|drop|3\": {\n \"em\": 0.006082214765100671,\n \"em_stderr\": 0.000796243239302896,\n \"f1\": 0.08614303691275203,\n \"f1_stderr\": 0.001761867125581607\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08946171341925702,\n \"acc_stderr\": 0.007861583049939721\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7466456195737964,\n \"acc_stderr\": 0.012223754434233614\n }\n}\n```", "repo_url": "https://huggingface.co/Sao10K/Stheno-Inverted-1.2-L2-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|arc:challenge|25_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T18_34_04.401753", "path": ["**/details_harness|drop|3_2023-10-23T18-34-04.401753.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T18-34-04.401753.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T18_34_04.401753", "path": ["**/details_harness|gsm8k|5_2023-10-23T18-34-04.401753.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T18-34-04.401753.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hellaswag|10_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-39-09.830541.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-39-09.830541.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T08-39-09.830541.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T08-39-09.830541.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T08-39-09.830541.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T18_34_04.401753", "path": ["**/details_harness|winogrande|5_2023-10-23T18-34-04.401753.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T18-34-04.401753.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_12T08_39_09.830541", "path": ["results_2023-09-12T08-39-09.830541.parquet"]}, {"split": "2023_10_23T18_34_04.401753", "path": ["results_2023-10-23T18-34-04.401753.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T18-34-04.401753.parquet"]}]}]}
2023-10-23T17:34:17+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Sao10K/Stheno-Inverted-1.2-L2-13B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Sao10K/Stheno-Inverted-1.2-L2-13B on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-23T18:34:04.401753(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Sao10K/Stheno-Inverted-1.2-L2-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Stheno-Inverted-1.2-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T18:34:04.401753(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Sao10K/Stheno-Inverted-1.2-L2-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Stheno-Inverted-1.2-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T18:34:04.401753(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 27, 31, 175, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Sao10K/Stheno-Inverted-1.2-L2-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Stheno-Inverted-1.2-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T18:34:04.401753(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
18f9459f15f03a622e950f4cd3d065b83a8b2a1e
# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098](https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-27T05:08:42.150075](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098/blob/main/results_2023-10-27T05-08-42.150075.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.002726510067114094,
        "em_stderr": 0.0005340111700415923,
        "f1": 0.05103817114093967,
        "f1_stderr": 0.0013185598027679899,
        "acc": 0.26317353140578503,
        "acc_stderr": 0.007674173740131278
    },
    "harness|drop|3": {
        "em": 0.002726510067114094,
        "em_stderr": 0.0005340111700415923,
        "f1": 0.05103817114093967,
        "f1_stderr": 0.0013185598027679899
    },
    "harness|gsm8k|5": {
        "acc": 0.002274450341167551,
        "acc_stderr": 0.0013121578148674266
    },
    "harness|winogrande|5": {
        "acc": 0.5240726124704025,
        "acc_stderr": 0.014036189665395129
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
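As a complement to the loading example in the card above, here is a minimal sketch of how one might pull the per-sample GSM8K details and the aggregated "results" configuration for this repository. The configuration name and the "latest" split for the per-task details are taken from the metadata below; treating the "results" configuration as also exposing a "latest" split is an assumption by analogy with the other configurations, and the column layout of these tables is not documented in the card, so the sketch simply prints the first row:

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098"

# Per-sample GSM8K details; the "latest" split aliases the most recent run.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k_details[0])

# The card describes a "results" configuration holding the aggregated metrics
# (the numbers shown in the "Latest results" section above); assuming it also
# exposes a "latest" split.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```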
open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098
[ "region:us" ]
2023-09-12T07:48:04+00:00
{"pretty_name": "Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098](https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-27T05:08:42.150075](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098/blob/main/results_2023-10-27T05-08-42.150075.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002726510067114094,\n \"em_stderr\": 0.0005340111700415923,\n \"f1\": 0.05103817114093967,\n \"f1_stderr\": 0.0013185598027679899,\n \"acc\": 0.26317353140578503,\n \"acc_stderr\": 0.007674173740131278\n },\n \"harness|drop|3\": {\n \"em\": 0.002726510067114094,\n \"em_stderr\": 0.0005340111700415923,\n \"f1\": 0.05103817114093967,\n \"f1_stderr\": 0.0013185598027679899\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \"acc_stderr\": 0.0013121578148674266\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5240726124704025,\n \"acc_stderr\": 0.014036189665395129\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|arc:challenge|25_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_27T05_08_42.150075", "path": ["**/details_harness|drop|3_2023-10-27T05-08-42.150075.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-27T05-08-42.150075.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_27T05_08_42.150075", "path": ["**/details_harness|gsm8k|5_2023-10-27T05-08-42.150075.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-27T05-08-42.150075.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": 
"2023_09_12T08_47_54.050773", "path": ["**/details_harness|hellaswag|10_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-47-54.050773.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-47-54.050773.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-47-54.050773.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T08-47-54.050773.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": 
[{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T08-47-54.050773.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T08-47-54.050773.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_27T05_08_42.150075", "path": ["**/details_harness|winogrande|5_2023-10-27T05-08-42.150075.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-27T05-08-42.150075.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_12T08_47_54.050773", "path": ["results_2023-09-12T08-47-54.050773.parquet"]}, {"split": "2023_10_27T05_08_42.150075", "path": ["results_2023-10-27T05-08-42.150075.parquet"]}, {"split": "latest", "path": ["results_2023-10-27T05-08-42.150075.parquet"]}]}]}
2023-10-27T04:08:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-27T05:08:42.150075 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
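The loading snippet that the summary above refers to was stripped out during processing. As a minimal sketch, assuming the details are published under the Open LLM Leaderboard's usual `details_<org>__<model>` naming scheme (the card itself only shows "URL" placeholders), loading one configuration could look like this; the `harness_winogrande_5` and `results` config names and the `latest` split are taken from the config list above.

```python
from datasets import load_dataset

# Assumed repository id following the leaderboard's naming convention;
# replace it with the actual details repo linked from the card.
repo_id = "open-llm-leaderboard/details_KnutJaegersberg__RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098"

# Per-sample details for one task, pinned to the most recent run.
winogrande = load_dataset(repo_id, "harness_winogrande_5", split="latest")
print(winogrande)

# Aggregated metrics for the whole run live in the "results" configuration.
results = load_dataset(repo_id, "results", split="latest")
print(results[0])
```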
[ "# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T05:08:42.150075(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-27T05:08:42.150075(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 41, 31, 189, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/RWKV-4-PilePlus-430M-20230520-6162-1018Gtokens-ctx4098 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-27T05:08:42.150075(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
716a9e93784420715fc34ff98489e4ca0a74edac
Creating high-quality synthetic datasets: - [x] Python Textbook with hands-on experience and Code-Exercises -> 42,491 words, 285,786 characters - [x] Test-Driven development with Python -> 66,126 words, 478,070 characters - [x] Torch in Python Textbook with hands-on experience and Code-Exercises -> 60,149 words, 473,343 characters Todo: - [ ] [*programming language*] hands-on experience and Code-Exercises - [ ] Test-driven development with [*programming language*] hands-on experience and Code-Exercises - [ ] [*special lib*] with [*programming language*] Textbook with hands-on experience and Code-Exercises Ideas: - programming languages: Javascript, Java, C, C++, C#, GO, (HTML/CSS), SQL, Typescript, (Bash/Shell), PHP, Rust, Kotlin, Ruby - special lib: - HTML/CSS: Bootstrap, Tailwind, - Python: Torch, Tensorflow, Mlflow, FastAPI, Flask, - Javascript/Typescript: Angular, React,
SebastianBodza/TextbooksAreAllYouNeed
[ "region:us" ]
2023-09-12T07:51:49+00:00
{}
2023-09-13T11:43:12+00:00
[]
[]
TAGS #region-us
Creating high-quality synthetic datasets: - [x] Python Textbook with hands-on experience and Code-Exercises -> 42,491 words, 285,786 characters - [x] Test-Driven development with Python -> 66,126 words, 478,070 characters - [x] Torch in Python Textbook with hands-on experience and Code-Exercises -> 60,149 words, 473,343 characters Todo: - [ ] [*programming language*] hands-on experience and Code-Exercises - [ ] Test-driven development with [*programming language*] hands-on experience and Code-Exercises - [ ] [*special lib*] with [*programming language*] Textbook with hands-on experience and Code-Exercises Ideas: - programming languages: Javascript, Java, C, C++, C#, GO, (HTML/CSS), SQL, Typescript, (Bash/Shell), PHP, Rust, Kotlin, Ruby - special lib: - HTML/CSS: Bootstrap, Tailwind, - Python: Torch, Tensorflow, Mlflow, FastAPI, Flask, - Javascript/Typescript: Angular, React,
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
44bab97124ccd0ac90ec5173b4375825eee2b0fc
# Dataset Card for "madras_dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
rikdas/madras_dataset
[ "region:us" ]
2023-09-12T07:53:18+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 22751754.0, "num_examples": 10}], "download_size": 22753302, "dataset_size": 22751754.0}}
2023-09-12T07:53:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for "madras_dataset" More Information needed
[ "# Dataset Card for \"madras_dataset\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"madras_dataset\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"madras_dataset\"\n\nMore Information needed" ]
5d6fe7d8d1daa2eb4183c413f77a7e67a5223203
# Dataset of chloe_von_einzbern/クロエ・フォン・アインツベルン/克洛伊·冯·爱因兹贝伦 (Fate/Grand Order) This is the dataset of chloe_von_einzbern/クロエ・フォン・アインツベルン/克洛伊·冯·爱因兹贝伦 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are `dark-skinned_female, dark_skin, long_hair, pink_hair, breasts, small_breasts, orange_eyes, bangs, hair_between_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 584.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_von_einzbern_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 357.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_von_einzbern_fgo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1207 | 758.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_von_einzbern_fgo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 528.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_von_einzbern_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1207 | 1.02 GiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_von_einzbern_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/chloe_von_einzbern_fgo', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | blush, loli, navel, 1girl, nipples, open_mouth, stomach_tattoo, 1boy, hetero, nude, penis, looking_at_viewer, sex, smile, thighs, vaginal, sidelocks, armpits, half_updo, spread_legs, hairpin, mosaic_censoring, cum_in_pussy, fake_animal_ears, long_sleeves, one_side_up, rabbit_ears, reverse_bunnysuit, shrug_(clothing) | | 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, blush, loli, looking_at_viewer, navel, nipples, pussy, smile, solo, censored, collarbone, sidelocks, spread_legs, thighs, anus, ass, completely_nude, open_mouth, one_side_up, sweat, after_sex, cum, lying | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blush, simple_background, white_background, hairpin, open_mouth, sidelocks, sweat, half_updo, loli, looking_at_viewer, navel, single_hair_bun, stomach_tattoo, thighs, 1boy, collarbone, nipples, solo_focus, ass, bare_shoulders, hetero, smile, black_bikini, nude, sex_from_behind, tongue_out | | 3 | 9 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, long_sleeves, looking_at_viewer, navel, waist_cape, blush, breastplate, bridal_gauntlets, shrug_(clothing), solo, stomach_tattoo, half_updo, red_cape, smile, black_panties, white_background, hair_bun, hairpin, black_footwear, heart, knee_boots, sidelocks, simple_background, thighs | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, dual_wielding, holding_sword, looking_at_viewer, solo, black_footwear, black_panties, bra, knee_boots, long_sleeves, navel, waist_cape, tattoo, full_body, midriff, smile, standing | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, bare_shoulders, blush, cosplay, covered_navel, highleg_leotard, spread_legs, stomach_tattoo, taimanin_suit, hairpin, loli, open_mouth, sidelocks, solo, covered_nipples, elbow_gloves, fishnets, pussy, sweat, bar_censor, black_leotard, black_thighhighs, half_updo, single_hair_bun, smile | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | navel, 2girls, blush, looking_at_viewer, 
solo_focus, armpits, black_bikini, collarbone, grin, micro_bikini, one_side_up, stomach_tattoo, arm_behind_head, arm_up, covered_nipples, side-tie_bikini_bottom, white_background, yellow_eyes | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | blush, long_sleeves, obi, 1girl, floral_print, sidelocks, smile, solo, looking_at_viewer, red_kimono, open_mouth, wide_sleeves | | 8 | 9 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, blush, looking_at_viewer, solo, one_side_up, smile, yellow_eyes, open_mouth | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, black_skirt, blush, full_body, homurahara_academy_school_uniform, looking_at_viewer, one_side_up, pleated_skirt, solo, kneehighs, simple_background, smile, white_background, black_socks, brown_footwear, closed_mouth, loafers, puffy_short_sleeves, white_shirt, brown_shirt, collarbone, hand_on_hip, long_sleeves, neck_ribbon, open_mouth, red_ribbon | | 10 | 13 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, blush, hair_bow, jingle_bell, looking_at_viewer, paw_gloves, red_bow, smile, bare_shoulders, cat_ears, open_mouth, solo, black_skirt, ponytail, sidelocks, navel, o-ring, animal_ear_fluff, cat_tail, collar, fur-trimmed_skirt, stomach_tattoo, thighs, white_background, black_thighhighs, hair_ornament, simple_background, thigh_strap | | 11 | 7 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 1girl, blush, classroom, looking_at_viewer, smile, solo, thighs, indoors, sidelocks, white_shirt, black_panties, chalkboard, crop_top, hair_ornament, navel, sailor_collar, serafuku, short_sleeves, black_skirt, brown_pantyhose, closed_mouth, long_sleeves, looking_back, neckerchief, pleated_skirt, school_desk, yellow_eyes | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | loli | navel | 1girl | nipples | open_mouth | stomach_tattoo | 1boy | hetero | nude | penis | looking_at_viewer | sex | smile | thighs | vaginal | sidelocks | armpits | half_updo | spread_legs | hairpin | mosaic_censoring | cum_in_pussy | fake_animal_ears | long_sleeves | one_side_up | rabbit_ears | reverse_bunnysuit | shrug_(clothing) | pussy | solo | censored | collarbone | anus | ass | completely_nude | sweat | after_sex | cum | lying | simple_background | white_background | single_hair_bun | solo_focus | bare_shoulders | black_bikini | sex_from_behind | tongue_out | waist_cape | breastplate | bridal_gauntlets | red_cape | black_panties | hair_bun | black_footwear | heart | knee_boots | dual_wielding | holding_sword | bra | tattoo | full_body | midriff | standing | cosplay | covered_navel | highleg_leotard | taimanin_suit | covered_nipples | elbow_gloves | fishnets | bar_censor | black_leotard | black_thighhighs | 2girls | grin | micro_bikini | arm_behind_head | arm_up | side-tie_bikini_bottom | yellow_eyes | obi | floral_print | red_kimono | wide_sleeves | black_skirt | 
homurahara_academy_school_uniform | pleated_skirt | kneehighs | black_socks | brown_footwear | closed_mouth | loafers | puffy_short_sleeves | white_shirt | brown_shirt | hand_on_hip | neck_ribbon | red_ribbon | hair_bow | jingle_bell | paw_gloves | red_bow | cat_ears | ponytail | o-ring | animal_ear_fluff | cat_tail | collar | fur-trimmed_skirt | hair_ornament | thigh_strap | classroom | indoors | chalkboard | crop_top | sailor_collar | serafuku | short_sleeves | brown_pantyhose | looking_back | neckerchief | school_desk | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------|:--------|:--------|:----------|:-------------|:-----------------|:-------|:---------|:-------|:--------|:--------------------|:------|:--------|:---------|:----------|:------------|:----------|:------------|:--------------|:----------|:-------------------|:---------------|:-------------------|:---------------|:--------------|:--------------|:--------------------|:-------------------|:--------|:-------|:-----------|:-------------|:-------|:------|:------------------|:--------|:------------|:------|:--------|:--------------------|:-------------------|:------------------|:-------------|:-----------------|:---------------|:------------------|:-------------|:-------------|:--------------|:-------------------|:-----------|:----------------|:-----------|:-----------------|:--------|:-------------|:----------------|:----------------|:------|:---------|:------------|:----------|:-----------|:----------|:----------------|:------------------|:----------------|:------------------|:---------------|:-----------|:-------------|:----------------|:-------------------|:---------|:-------|:---------------|:------------------|:---------|:-------------------------|:--------------|:------|:---------------|:-------------|:---------------|:--------------|:------------------------------------|:----------------|:------------|:--------------|:-----------------|:---------------|:----------|:----------------------|:--------------|:--------------|:--------------|:--------------|:-------------|:-----------|:--------------|:-------------|:----------|:-----------|:-----------|:---------|:-------------------|:-----------|:---------|:--------------------|:----------------|:--------------|:------------|:----------|:-------------|:-----------|:----------------|:-----------|:----------------|:------------------|:---------------|:--------------|:--------------| | 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | | | | | | X | | X | X | | X | | | X | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 
| | | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | X | X | X | X | | X | | X | X | | X | | X | | X | | | | | | | | | | | | X | | X | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 9 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | | | X | | | | | X | | X | X | | X | | X | | X | | | | X | | | | X | | X | | | | | | | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 5 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | | | X | X | | | | | | | | X | | X | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | X | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | | X | | X | X | | | | | | | X | | | X | | X | X | X | | | | | | | | | X | X | | | | | | X | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 5 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | | | | X | | | | | X | | | | | | X | | | | | | | | X | | | | | | | X | | | | | | | | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | | X | | X | | | | | | X | | X | | | X | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 9 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | | X | | X | | | | | | X | | X | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | X | | X | | | | | | X | | X | | | | | | | | | | | X | X | | | | | X | | X | | | | | | | | X | X | | | | | | | | | | 
| | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | 10 | 13 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | X | X | | X | X | | | | | X | | X | X | | X | | | | | | | | | | | | | | X | | | | | | | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | 11 | 7 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | | X | X | | | | | | | | X | | X | X | | X | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | X | | | | X | | | X | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X |
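The packages table near the top of this card also lists pre-processed IMG+TXT archives (e.g. `dataset-800.zip`). A small sketch for fetching one of those instead of the raw archive, assuming it unpacks to image files with same-named `.txt` tag files (the exact internal layout is not spelled out in the card):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download the 800px IMG+TXT package listed in the packages table.
zip_file = hf_hub_download(
    repo_id='CyberHarem/chloe_von_einzbern_fgo',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Extract it into a local directory.
extract_dir = 'dataset_800'
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)

# Assumed layout: each image ships with a matching .txt file holding its tags.
for name in sorted(os.listdir(extract_dir)):
    if name.endswith('.txt'):
        with open(os.path.join(extract_dir, name), encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```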
CyberHarem/chloe_von_einzbern_fgo
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T08:07:53+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-12T12:43:39+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of chloe\_von\_einzbern/クロエ・フォン・アインツベルン/克洛伊·冯·爱因兹贝伦 (Fate/Grand Order) ============================================================================== This is the dataset of chloe\_von\_einzbern/クロエ・フォン・アインツベルン/克洛伊·冯·爱因兹贝伦 (Fate/Grand Order), containing 500 images and their tags. The core tags of this character are 'dark-skinned\_female, dark\_skin, long\_hair, pink\_hair, breasts, small\_breasts, orange\_eyes, bangs, hair\_between\_eyes', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
3e23fec579d7308f35b031a9f6a33cfcf13aa0e5
# Dataset of mikan/ミカン (Pokémon) This is the dataset of mikan/ミカン (Pokémon), containing 492 images and their tags. The core tags of this character are `brown_hair, long_hair, two_side_up, hair_ornament, brown_eyes, breasts, bow`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 492 | 384.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mikan_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 492 | 261.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mikan_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 953 | 481.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mikan_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 492 | 354.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mikan_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 953 | 617.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mikan_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/mikan_pokemon', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1boy, 1girl, hetero, penis, blush, nipples, pussy, solo_focus, hair_bobbles, sex, spread_legs, collarbone, navel, open_mouth, sweat, vaginal, eyelashes, veins, completely_nude, mosaic_censoring, shiny_skin, small_breasts | | 1 | 32 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, nipples, hetero, hair_bobbles, pokemon_(creature), sex, pokephilia, blush, bestiality, open_mouth, interspecies, small_breasts, stomach_bulge, penis, vaginal, rolling_eyes, uncensored, barefoot, navel, pussy, raised_eyebrows, collarbone, eyelashes, spread_legs, teeth, toes, ahegao, clitoris, large_insertion, completely_nude, tongue_out | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blush, navel, nipples, solo, hair_bobbles, medium_breasts, nude, looking_at_viewer, simple_background, smile, collarbone, standing, sweat | | 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, dress, hair_bobbles, looking_at_viewer, pokemon_(creature), blush, closed_mouth, collarbone, orange_bow, smile, upper_body, yellow_eyes | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, collarbone, hair_bobbles, simple_background, sleeveless_dress, white_background, white_dress, bare_shoulders, closed_mouth, flat_chest, forehead, jaggy_lines, smile, solo, happy, purple_eyes, split_mouth, upper_body, looking_at_viewer, no_bra, straight-on | | 5 | 13 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, hair_bobbles, pokemon_(creature), open_mouth, white_dress, :d, sandals, sitting | | 6 | 9 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, hair_bobbles, smile, solo, standing, orange_bow, sandals, looking_at_viewer, simple_background, toes, white_background, closed_mouth, collarbone, full_body, green_dress, eyelashes, knees, sleeves_past_elbows | | 7 | 7 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, green_dress, hair_bobbles, 
pokemon_(creature), sleeves_past_elbows, eyelashes, floating_hair, orange_bow, collarbone, looking_at_viewer, smile, blush, clenched_hand, closed_mouth, hand_up, standing | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, dress, holding_poke_ball, poke_ball_(basic), solo, looking_at_viewer, blush, hair_bobbles, orange_bow | | 9 | 10 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, dress, blush, solo | | 10 | 15 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | hat, official_alternate_costume, 1girl, eyelashes, white_headwear, red_gloves, sleeveless_dress, open_mouth, tongue, white_dress, blush, looking_at_viewer, pokemon_(creature), :d, christmas, buttons | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | hetero | penis | blush | nipples | pussy | solo_focus | hair_bobbles | sex | spread_legs | collarbone | navel | open_mouth | sweat | vaginal | eyelashes | veins | completely_nude | mosaic_censoring | shiny_skin | small_breasts | pokemon_(creature) | pokephilia | bestiality | interspecies | stomach_bulge | rolling_eyes | uncensored | barefoot | raised_eyebrows | teeth | toes | ahegao | clitoris | large_insertion | tongue_out | solo | medium_breasts | nude | looking_at_viewer | simple_background | smile | standing | dress | closed_mouth | orange_bow | upper_body | yellow_eyes | sleeveless_dress | white_background | white_dress | bare_shoulders | flat_chest | forehead | jaggy_lines | happy | purple_eyes | split_mouth | no_bra | straight-on | :d | sandals | sitting | full_body | green_dress | knees | sleeves_past_elbows | floating_hair | clenched_hand | hand_up | holding_poke_ball | poke_ball_(basic) | hat | official_alternate_costume | white_headwear | red_gloves | tongue | christmas | buttons | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------|:--------|:---------|:--------|:--------|:----------|:--------|:-------------|:---------------|:------|:--------------|:-------------|:--------|:-------------|:--------|:----------|:------------|:--------|:------------------|:-------------------|:-------------|:----------------|:---------------------|:-------------|:-------------|:---------------|:----------------|:---------------|:-------------|:-----------|:------------------|:--------|:-------|:---------|:-----------|:------------------|:-------------|:-------|:-----------------|:-------|:--------------------|:--------------------|:--------|:-----------|:--------|:---------------|:-------------|:-------------|:--------------|:-------------------|:-------------------|:--------------|:-----------------|:-------------|:-----------|:--------------|:--------|:--------------|:--------------|:---------|:--------------|:-----|:----------|:----------|:------------|:--------------|:--------|:----------------------|:----------------|:----------------|:----------|:--------------------|:--------------------|:------|:-----------------------------|:-----------------|:-------------|:---------|:------------|:----------| | 0 | 10 
| ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 32 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | | X | X | X | X | X | X | | X | X | X | X | X | X | | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | | X | | | X | X | | | X | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 8 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | | X | | | X | | | | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 6 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | | X | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | 5 | 13 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | | X | | | | | | | X | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | 6 | 9 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | | X | | | | | | | X | | | X | | | | | X | | | | | | | | | | | | | | | | X | | | | | X | | | X | X | X | X | | X | X | | | | X | | | | | | | | | | | | X | | X | X | X | X | | | | | | | | | | | | | | 7 | 7 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | | X | | | X | | | | X | | | X | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | X | | X | X | | X | X | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | | | | | | | | | | | 8 | 6 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | 9 | 10 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | | X | | | X | | | 
| | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 10 | 15 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | | X | | | X | | | | | | | | | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | | X | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X |
CyberHarem/mikan_pokemon
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T08:09:40+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T21:31:42+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of mikan/ミカン (Pokémon) ============================== This is the dataset of mikan/ミカン (Pokémon), containing 492 images and their tags. The core tags of this character are 'brown\_hair, long\_hair, two\_side\_up, hair\_ornament, brown\_eyes, breasts, bow', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
c6af9cafdc0979ecf1f621bd16c726569dce580e
# Dataset Card for "turkishReviews-ds-mini" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ahmet1338/turkishReviews-ds-mini
[ "language:tr", "region:us" ]
2023-09-12T08:10:46+00:00
{"language": ["tr"], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "review", "dtype": "string"}, {"name": "review_length", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1252876.2642514652, "num_examples": 3378}, {"name": "validation", "num_bytes": 139455.7357485349, "num_examples": 376}], "download_size": 896649, "dataset_size": 1392332}}
2023-10-02T18:23:56+00:00
[]
[ "tr" ]
TAGS #language-Turkish #region-us
# Dataset Card for "turkishReviews-ds-mini" More Information needed
[ "# Dataset Card for \"turkishReviews-ds-mini\"\n\nMore Information needed" ]
[ "TAGS\n#language-Turkish #region-us \n", "# Dataset Card for \"turkishReviews-ds-mini\"\n\nMore Information needed" ]
[ 12, 19 ]
[ "passage: TAGS\n#language-Turkish #region-us \n# Dataset Card for \"turkishReviews-ds-mini\"\n\nMore Information needed" ]
2fc65241d7f67417857b3536360a60ba850680aa
# Dataset Card for "shp-generated_flan_t5_large_external_rm1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dongyoung4091/shp-generated_flan_t5_large_external_rm1
[ "region:us" ]
2023-09-12T08:12:19+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "external_rm1", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 27036265, "num_examples": 25600}], "download_size": 1847255, "dataset_size": 27036265}}
2023-09-12T08:13:50+00:00
[]
[]
TAGS #region-us
# Dataset Card for "shp-generated_flan_t5_large_external_rm1" More Information needed
[ "# Dataset Card for \"shp-generated_flan_t5_large_external_rm1\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"shp-generated_flan_t5_large_external_rm1\"\n\nMore Information needed" ]
[ 6, 31 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"shp-generated_flan_t5_large_external_rm1\"\n\nMore Information needed" ]
1963e34d65174305c837bb1bd2ec7ebafb7b1df4
# Dataset Card for "BeboUpdated" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
BEBO-DBIndia/BeboUpdated
[ "region:us" ]
2023-09-12T08:32:02+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2799, "num_examples": 9}], "download_size": 2821, "dataset_size": 2799}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-09-12T08:37:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for "BeboUpdated" More Information needed
[ "# Dataset Card for \"BeboUpdated\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"BeboUpdated\"\n\nMore Information needed" ]
[ 6, 14 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"BeboUpdated\"\n\nMore Information needed" ]
fdd875691f8111ba22dc08193403235537b1bb28
# Dataset Card for Structures Damaged by Wildfire **Homepage:** [Image Dataset of Structures Damaged by Wildfire in California 2020-2022](https://zenodo.org/record/8336570) ### Dataset Summary The dataset contains over 18,000 images of homes damaged by wildfire between 2020 and 2022 in California, USA, captured by the California Department of Forestry and Fire Protection (Cal Fire) during the damage assessment process. The dataset spans more than 18 wildfire events, including the 2020 August Complex Fire, the first recorded "gigafire" event in California where the area burned exceeded 1 million acres. Each image, corresponding to a built structure, is classified by government damage assessors into 6 different categories: Inaccessible (image taken but no assessment made), No Damage, Affected (1-9%), Minor (10-25%), Major (26-50%), and Destroyed (>50%). While over 57,000 structures were evaluated during the damage assessment process, only about 18,000 contain images; additional data about the structures, such as the street address or structure materials, for both those with and without corresponding images can be accessed in the "Additional Attribute Data" file. The 18 wildfire events captured in the dataset are: - [AUG] August Complex (2020) - [BEA] Bear Fire (2020) - [BEU] BEU Lightning Complex Fire (2020) - [CAL] Caldor Fire (2021) - [CAS] Castle Fire (2020) - [CRE] Creek Fire (2020) - [DIN] DINS Statewide (Collection of Smaller Fires, 2021) - [DIX] Dixie Fire (2021) - [FAI] Fairview Fire (2022) - [FOR] Fork Fire (2022) - [GLA] Glass Fire (2020) - [MIL] Mill Mountain Fire (2022) - [MON] Monument Fire (2021) - [MOS] Mosquito Fire (2022) - [POST] Post Fire (2020) - [SCU] SCU Complex Fire (2020) - [VAL] Valley Fire (2020) - [ZOG] Zogg Fire (2020) The author retrieved the data, originally published as GIS feature layers, from the publicly accessible CAL FIRE Hub, then subsequently processed it into image and tabular formats. The author collaborated with Cal Fire in working with the data, and has received explicit permission for republication. ### Data Fields The data instances have the following fields: - `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`. - `label`: an `int` classification label. Class Label Mappings: ``` { "affected": 0, "destroyed": 1, "inaccessible": 2, "major": 3, "minor": 4, "no_damage": 5, } ``` ### Data Splits | | train | |---------------|------:| | # of examples | 18,714 |
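A minimal sketch of reading one sample and mapping its integer label back to the class names above, assuming the dataset resolves from the Hub as kevincluo/structure_wildfire_damage_classification with the image/label features declared in this record's metadata:

```python
from datasets import load_dataset

ds = load_dataset("kevincluo/structure_wildfire_damage_classification", split="train")

sample = ds[0]               # index the sample first, then the column (cheaper than ds["image"][0])
image = sample["image"]      # PIL.Image.Image, decoded on access
label_id = sample["label"]   # int class id in 0..5

# The ClassLabel feature recovers the readable name ('affected', ..., 'no_damage').
label_name = ds.features["label"].int2str(label_id)
print(image.size, label_id, label_name)
```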
kevincluo/structure_wildfire_damage_classification
[ "language:en", "license:cc-by-4.0", "climate", "wildfire", "image classification", "damage assessment", "region:us" ]
2023-09-12T08:46:20+00:00
{"language": ["en"], "license": "cc-by-4.0", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "affected", "1": "destroyed", "2": "inaccessible", "3": "major", "4": "minor", "5": "no_damage"}}}}], "splits": [{"name": "train", "num_bytes": 125229532, "num_examples": 355}], "download_size": 125234000, "dataset_size": 125229532}, "tags": ["climate", "wildfire", "image classification", "damage assessment"]}
2023-09-13T23:11:33+00:00
[]
[ "en" ]
TAGS #language-English #license-cc-by-4.0 #climate #wildfire #image classification #damage assessment #region-us
Dataset Card for Structures Damaged by Wildfire =============================================== Homepage: Image Dataset of Structures Damaged by Wildfire in California 2020-2022 ### Dataset Summary The dataset contains over 18,000 images of homes damaged by wildfire between 2020 and 2022 in California, USA, captured by the California Department of Forestry and Fire Protection (Cal Fire) during the damage assessment process. The dataset spans more than 18 wildfire events, including the 2020 August Complex Fire, the first recorded "gigafire" event in California where the area burned exceeded 1 million acres. Each image, corresponding to a built structure, is classified by government damage assessors into 6 different categories: Inaccessible (image taken but no assessment made), No Damage, Affected (1-9%), Minor (10-25%), Major (26-50%), and Destroyed (>50%). While over 57,000 structures were evaluated during the damage assessment process, only about 18,000 contain images; additional data about the structures, such as the street address or structure materials, for both those with and without corresponding images can be accessed in the "Additional Attribute Data" file. The 18 wildfire events captured in the dataset are: * [AUG] August Complex (2020) * [BEA] Bear Fire (2020) * [BEU] BEU Lightning Complex Fire (2020) * [CAL] Caldor Fire (2021) * [CAS] Castle Fire (2020) * [CRE] Creek Fire (2020) * [DIN] DINS Statewide (Collection of Smaller Fires, 2021) * [DIX] Dixie Fire (2021) * [FAI] Fairview Fire (2022) * [FOR] Fork Fire (2022) * [GLA] Glass Fire (2020) * [MIL] Mill Mountain Fire (2022) * [MON] Monument Fire (2021) * [MOS] Mosquito Fire (2022) * [POST] Post Fire (2020) * [SCU] SCU Complex Fire (2020) * [VAL] Valley Fire (2020) * [ZOG] Zogg Fire (2020) The author retrieved the data, originally published as GIS feature layers, from the publicly accessible CAL FIRE Hub, then subsequently processed it into image and tabular formats. The author collaborated with Cal Fire in working with the data, and has received explicit permission for republication. ### Data Fields The data instances have the following fields: * 'image': A 'PIL.Image.Image' object containing the image. Note that when accessing the image column: 'dataset[0]["image"]' the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the '"image"' column, *i.e.* 'dataset[0]["image"]' should always be preferred over 'dataset["image"][0]'. * 'label': an 'int' classification label. Class Label Mappings: ### Data Splits
[ "### Dataset Summary\n\n\nThe dataset contains over 18,000 images of homes damaged by wildfire between 2020 and 2022 in California, USA, captured by the California Department of Forestry and Fire Protection (Cal Fire) during the damage assessment process. The dataset spans across more than 18 wildfire events, including the 2020 August Complex Fire, the first recorded \"gigafire\" event in California where the area burned exceeded 1 million acres. Each image, corresponding to a built structure, is classified by government damage assessors into 6 different categories: Inaccessible (image taken but no assessment made), No Damage, Affected (1-9%), Minor (10-25%), Major (26-50%), and Destroyed (>50%). While over 57,000 structures were evaluated during the damage assessment process, only about 18,000 contains images; additional data about the structures, such as the street address or structure materials, for both those with and without corresponding images can be accessed in the \"Additional Attribute Data\" file.\n\n\nThe 18 wildfire events captured in the dataset are:\n\n\n* [AUG] August Complex (2020)\n* [BEA] Bear Fire (2020)\n* [BEU] BEU Lightning Complex Fire (2020)\n* [CAL] Caldor Fire (2021)\n* [CAS] Castle Fire (2020)\n* [CRE] Creek Fire (2020)\n* [DIN] DINS Statewide (Collection of Smaller Fires, 2021)\n* [DIX[ Dixie Fire (2021)\n* [FAI] Fairview Fire (2022)\n* [FOR] Fork Fire (2022)\n* [GLA] Glass Fire (2020)\n* [MIL] Mill Mountain Fire (2022)\n* [MON] Monument Fire (2021)\n* [MOS] Mosquito Fire (2022)\n* [POST] Post Fire (2020)\n* [SCU] SCU Complex Fire (2020)\n* [VAL] Valley Fire (2020)\n* [ZOG] Zogg Fire (2020)\n\n\nThe author retrieved the data, originally published as GIS features layers, from from the publicly accessible CAL FIRE Hub, then subsequently processed it into image and tabular formats. The author collaborated with Cal Fire in working with the data, and has received explicit permission for republication.", "### Data Fields\n\n\nThe data instances have the following fields:\n\n\n* 'image': A 'PIL.Image.Image' object containing the image. Note that when accessing the image column: 'dataset[0][\"image\"]' the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the '\"image\"' column, *i.e.* 'dataset[0][\"image\"]' should always be preferred over 'dataset[\"image\"][0]'.\n* 'labels': an 'int' classification label.\n\n\nClass Label Mappings:", "### Data Splits" ]
[ "TAGS\n#language-English #license-cc-by-4.0 #climate #wildfire #image classification #damage assessment #region-us \n", "### Dataset Summary\n\n\nThe dataset contains over 18,000 images of homes damaged by wildfire between 2020 and 2022 in California, USA, captured by the California Department of Forestry and Fire Protection (Cal Fire) during the damage assessment process. The dataset spans across more than 18 wildfire events, including the 2020 August Complex Fire, the first recorded \"gigafire\" event in California where the area burned exceeded 1 million acres. Each image, corresponding to a built structure, is classified by government damage assessors into 6 different categories: Inaccessible (image taken but no assessment made), No Damage, Affected (1-9%), Minor (10-25%), Major (26-50%), and Destroyed (>50%). While over 57,000 structures were evaluated during the damage assessment process, only about 18,000 contains images; additional data about the structures, such as the street address or structure materials, for both those with and without corresponding images can be accessed in the \"Additional Attribute Data\" file.\n\n\nThe 18 wildfire events captured in the dataset are:\n\n\n* [AUG] August Complex (2020)\n* [BEA] Bear Fire (2020)\n* [BEU] BEU Lightning Complex Fire (2020)\n* [CAL] Caldor Fire (2021)\n* [CAS] Castle Fire (2020)\n* [CRE] Creek Fire (2020)\n* [DIN] DINS Statewide (Collection of Smaller Fires, 2021)\n* [DIX[ Dixie Fire (2021)\n* [FAI] Fairview Fire (2022)\n* [FOR] Fork Fire (2022)\n* [GLA] Glass Fire (2020)\n* [MIL] Mill Mountain Fire (2022)\n* [MON] Monument Fire (2021)\n* [MOS] Mosquito Fire (2022)\n* [POST] Post Fire (2020)\n* [SCU] SCU Complex Fire (2020)\n* [VAL] Valley Fire (2020)\n* [ZOG] Zogg Fire (2020)\n\n\nThe author retrieved the data, originally published as GIS features layers, from from the publicly accessible CAL FIRE Hub, then subsequently processed it into image and tabular formats. The author collaborated with Cal Fire in working with the data, and has received explicit permission for republication.", "### Data Fields\n\n\nThe data instances have the following fields:\n\n\n* 'image': A 'PIL.Image.Image' object containing the image. Note that when accessing the image column: 'dataset[0][\"image\"]' the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the '\"image\"' column, *i.e.* 'dataset[0][\"image\"]' should always be preferred over 'dataset[\"image\"][0]'.\n* 'labels': an 'int' classification label.\n\n\nClass Label Mappings:", "### Data Splits" ]
[ 35, 494, 161, 5 ]
[ "passage: TAGS\n#language-English #license-cc-by-4.0 #climate #wildfire #image classification #damage assessment #region-us \n" ]
9f589310043b5041ecea91361ea4d4505a6abcd6
# Dataset Card for Evaluation run of Sao10K/Medusa-1.1-L2-7B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/Sao10K/Medusa-1.1-L2-7B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [Sao10K/Medusa-1.1-L2-7B](https://huggingface.co/Sao10K/Medusa-1.1-L2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-23T22:27:14.314386](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B/blob/main/results_2023-10-23T22-27-14.314386.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.2837667785234899, "em_stderr": 0.004616870115379374, "f1": 0.3653198406040281, "f1_stderr": 0.004545820875148166, "acc": 0.3824984008238525, "acc_stderr": 0.007721122557033827 }, "harness|drop|3": { "em": 0.2837667785234899, "em_stderr": 0.004616870115379374, "f1": 0.3653198406040281, "f1_stderr": 0.004545820875148166 }, "harness|gsm8k|5": { "acc": 0.014404852160727824, "acc_stderr": 0.003282055917136963 }, "harness|winogrande|5": { "acc": 0.7505919494869772, "acc_stderr": 0.01216018919693069 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
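The snippet in the card loads one per-task details configuration; the aggregated numbers live in the separate "results" configuration. A minimal sketch of pulling them, assuming only the "results" config name and the "latest" split listed in this record's configs metadata (the column layout of the results parquet is not documented here, so inspect it before relying on specific fields):

```python
from datasets import load_dataset

# "results" config, "latest" split, as listed in the dataset's config metadata.
results = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B",
    "results",
    split="latest",
)

print(results.column_names)  # inspect the aggregated-metric columns first
print(results[0])            # one row of aggregated results
```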
open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B
[ "region:us" ]
2023-09-12T08:52:36+00:00
{"pretty_name": "Evaluation run of Sao10K/Medusa-1.1-L2-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Sao10K/Medusa-1.1-L2-7B](https://huggingface.co/Sao10K/Medusa-1.1-L2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T22:27:14.314386](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Medusa-1.1-L2-7B/blob/main/results_2023-10-23T22-27-14.314386.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2837667785234899,\n \"em_stderr\": 0.004616870115379374,\n \"f1\": 0.3653198406040281,\n \"f1_stderr\": 0.004545820875148166,\n \"acc\": 0.3824984008238525,\n \"acc_stderr\": 0.007721122557033827\n },\n \"harness|drop|3\": {\n \"em\": 0.2837667785234899,\n \"em_stderr\": 0.004616870115379374,\n \"f1\": 0.3653198406040281,\n \"f1_stderr\": 0.004545820875148166\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \"acc_stderr\": 0.003282055917136963\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7505919494869772,\n \"acc_stderr\": 0.01216018919693069\n }\n}\n```", "repo_url": "https://huggingface.co/Sao10K/Medusa-1.1-L2-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|arc:challenge|25_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T22_27_14.314386", "path": ["**/details_harness|drop|3_2023-10-23T22-27-14.314386.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T22-27-14.314386.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T22_27_14.314386", "path": ["**/details_harness|gsm8k|5_2023-10-23T22-27-14.314386.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T22-27-14.314386.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hellaswag|10_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-52-20.607338.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-52-20.607338.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T09-52-20.607338.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T09-52-20.607338.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T09-52-20.607338.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T22_27_14.314386", "path": ["**/details_harness|winogrande|5_2023-10-23T22-27-14.314386.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T22-27-14.314386.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_12T09_52_20.607338", "path": ["results_2023-09-12T09-52-20.607338.parquet"]}, {"split": "2023_10_23T22_27_14.314386", "path": ["results_2023-10-23T22-27-14.314386.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T22-27-14.314386.parquet"]}]}]}
2023-10-23T21:27:27+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Sao10K/Medusa-1.1-L2-7B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Sao10K/Medusa-1.1-L2-7B on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-23T22:27:14.314386 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
[ "# Dataset Card for Evaluation run of Sao10K/Medusa-1.1-L2-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Medusa-1.1-L2-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T22:27:14.314386(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Sao10K/Medusa-1.1-L2-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Medusa-1.1-L2-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T22:27:14.314386(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 22, 31, 170, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Sao10K/Medusa-1.1-L2-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Medusa-1.1-L2-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T22:27:14.314386(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
596e9674183d10651e84a283c08e66ed91d3fe3c
# Dataset Card for Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B](https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B", "harness_truthfulqa_mc_0", split="train") ``` ## Latest results These are the [latest results from run 2023-09-12T09:58:38.972064](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B/blob/main/results_2023-09-12T09-58-38.972064.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4674367304116677, "acc_stderr": 0.035284344124032196, "acc_norm": 0.4714260290393888, "acc_norm_stderr": 0.03526985338617593, "mc1": 0.2631578947368421, "mc1_stderr": 0.015415241740237014, "mc2": 0.3961362396399567, "mc2_stderr": 0.013785031017759436 }, "harness|arc:challenge|25": { "acc": 0.4948805460750853, "acc_stderr": 0.014610624890309157, "acc_norm": 0.5332764505119454, "acc_norm_stderr": 0.014578995859605802 }, "harness|hellaswag|10": { "acc": 0.5902210714997013, "acc_stderr": 0.004907877144720015, "acc_norm": 0.7871937860983867, "acc_norm_stderr": 0.004084552641903664 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4740740740740741, "acc_stderr": 0.04313531696750575, "acc_norm": 0.4740740740740741, "acc_norm_stderr": 0.04313531696750575 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.40789473684210525, "acc_stderr": 0.03999309712777471, "acc_norm": 0.40789473684210525, "acc_norm_stderr": 0.03999309712777471 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.4377358490566038, "acc_stderr": 0.030533338430467516, "acc_norm": 0.4377358490566038, "acc_norm_stderr": 0.030533338430467516 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4583333333333333, "acc_stderr": 0.04166666666666665, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.04166666666666665 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.34, "acc_stderr": 0.047609522856952344, "acc_norm": 0.34, "acc_norm_stderr": 0.047609522856952344 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.43352601156069365, "acc_stderr": 0.03778621079092055, "acc_norm": 0.43352601156069365, "acc_norm_stderr": 0.03778621079092055 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.19607843137254902, "acc_stderr": 0.03950581861179964, "acc_norm": 0.19607843137254902, "acc_norm_stderr": 0.03950581861179964 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.425531914893617, "acc_stderr": 0.03232146916224469, "acc_norm": 0.425531914893617, "acc_norm_stderr": 0.03232146916224469 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2982456140350877, "acc_stderr": 0.04303684033537315, "acc_norm": 0.2982456140350877, "acc_norm_stderr": 0.04303684033537315 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.42758620689655175, "acc_stderr": 0.04122737111370331, "acc_norm": 0.42758620689655175, "acc_norm_stderr": 0.04122737111370331 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2724867724867725, "acc_stderr": 0.022930973071633363, "acc_norm": 0.2724867724867725, "acc_norm_stderr": 0.022930973071633363 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3253968253968254, "acc_stderr": 
0.04190596438871136, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.04190596438871136 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5064516129032258, "acc_stderr": 0.02844163823354051, "acc_norm": 0.5064516129032258, "acc_norm_stderr": 0.02844163823354051 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.33497536945812806, "acc_stderr": 0.033208527423483104, "acc_norm": 0.33497536945812806, "acc_norm_stderr": 0.033208527423483104 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5818181818181818, "acc_stderr": 0.03851716319398393, "acc_norm": 0.5818181818181818, "acc_norm_stderr": 0.03851716319398393 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5151515151515151, "acc_stderr": 0.03560716516531061, "acc_norm": 0.5151515151515151, "acc_norm_stderr": 0.03560716516531061 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6580310880829016, "acc_stderr": 0.03423465100104283, "acc_norm": 0.6580310880829016, "acc_norm_stderr": 0.03423465100104283 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.44358974358974357, "acc_stderr": 0.025189149894764198, "acc_norm": 0.44358974358974357, "acc_norm_stderr": 0.025189149894764198 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2814814814814815, "acc_stderr": 0.027420019350945277, "acc_norm": 0.2814814814814815, "acc_norm_stderr": 0.027420019350945277 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.031968769891957786, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.031968769891957786 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2980132450331126, "acc_stderr": 0.03734535676787198, "acc_norm": 0.2980132450331126, "acc_norm_stderr": 0.03734535676787198 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6330275229357798, "acc_stderr": 0.020664675659520525, "acc_norm": 0.6330275229357798, "acc_norm_stderr": 0.020664675659520525 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.26851851851851855, "acc_stderr": 0.030225226160012383, "acc_norm": 0.26851851851851855, "acc_norm_stderr": 0.030225226160012383 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.5735294117647058, "acc_stderr": 0.03471157907953427, "acc_norm": 0.5735294117647058, "acc_norm_stderr": 0.03471157907953427 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6244725738396625, "acc_stderr": 0.03152256243091156, "acc_norm": 0.6244725738396625, "acc_norm_stderr": 0.03152256243091156 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5605381165919282, "acc_stderr": 0.03331092511038179, "acc_norm": 0.5605381165919282, "acc_norm_stderr": 0.03331092511038179 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5725190839694656, "acc_stderr": 0.04338920305792401, "acc_norm": 0.5725190839694656, "acc_norm_stderr": 0.04338920305792401 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6115702479338843, "acc_stderr": 0.04449270350068383, "acc_norm": 0.6115702479338843, "acc_norm_stderr": 0.04449270350068383 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5185185185185185, "acc_stderr": 0.04830366024635331, 
"acc_norm": 0.5185185185185185, "acc_norm_stderr": 0.04830366024635331 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5214723926380368, "acc_stderr": 0.03924746876751129, "acc_norm": 0.5214723926380368, "acc_norm_stderr": 0.03924746876751129 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.38392857142857145, "acc_stderr": 0.04616143075028547, "acc_norm": 0.38392857142857145, "acc_norm_stderr": 0.04616143075028547 }, "harness|hendrycksTest-management|5": { "acc": 0.5631067961165048, "acc_stderr": 0.049111471073657764, "acc_norm": 0.5631067961165048, "acc_norm_stderr": 0.049111471073657764 }, "harness|hendrycksTest-marketing|5": { "acc": 0.6923076923076923, "acc_stderr": 0.03023638994217308, "acc_norm": 0.6923076923076923, "acc_norm_stderr": 0.03023638994217308 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6462324393358876, "acc_stderr": 0.017098184708161903, "acc_norm": 0.6462324393358876, "acc_norm_stderr": 0.017098184708161903 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5, "acc_stderr": 0.026919095102908273, "acc_norm": 0.5, "acc_norm_stderr": 0.026919095102908273 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24022346368715083, "acc_stderr": 0.014288343803925293, "acc_norm": 0.24022346368715083, "acc_norm_stderr": 0.014288343803925293 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.477124183006536, "acc_stderr": 0.028599936776089775, "acc_norm": 0.477124183006536, "acc_norm_stderr": 0.028599936776089775 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5884244372990354, "acc_stderr": 0.02795048149440127, "acc_norm": 0.5884244372990354, "acc_norm_stderr": 0.02795048149440127 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.4845679012345679, "acc_stderr": 0.0278074900442762, "acc_norm": 0.4845679012345679, "acc_norm_stderr": 0.0278074900442762 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3829787234042553, "acc_stderr": 0.02899908090480618, "acc_norm": 0.3829787234042553, "acc_norm_stderr": 0.02899908090480618 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.363754889178618, "acc_stderr": 0.012286991879902884, "acc_norm": 0.363754889178618, "acc_norm_stderr": 0.012286991879902884 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5330882352941176, "acc_stderr": 0.03030625772246832, "acc_norm": 0.5330882352941176, "acc_norm_stderr": 0.03030625772246832 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.46078431372549017, "acc_stderr": 0.020165523313907904, "acc_norm": 0.46078431372549017, "acc_norm_stderr": 0.020165523313907904 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5454545454545454, "acc_stderr": 0.04769300568972743, "acc_norm": 0.5454545454545454, "acc_norm_stderr": 0.04769300568972743 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5061224489795918, "acc_stderr": 0.03200682020163908, "acc_norm": 0.5061224489795918, "acc_norm_stderr": 0.03200682020163908 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6318407960199005, "acc_stderr": 0.03410410565495301, "acc_norm": 0.6318407960199005, "acc_norm_stderr": 0.03410410565495301 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-virology|5": { "acc": 0.40963855421686746, "acc_stderr": 0.03828401115079022, "acc_norm": 
0.40963855421686746, "acc_norm_stderr": 0.03828401115079022 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7017543859649122, "acc_stderr": 0.03508771929824563, "acc_norm": 0.7017543859649122, "acc_norm_stderr": 0.03508771929824563 }, "harness|truthfulqa:mc|0": { "mc1": 0.2631578947368421, "mc1_stderr": 0.015415241740237014, "mc2": 0.3961362396399567, "mc2_stderr": 0.013785031017759436 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
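The snippet in the summary above loads one per-task configuration; the aggregated numbers quoted under "Latest results" live in the separate "results" configuration. A minimal sketch, assuming the "results" config and "latest" split names listed in this card's metadata:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; the "latest" split
# points at the most recent evaluation (here 2023-09-12T09:58:38.972064).
results = load_dataset(
    "open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B",
    "results",
    split="latest",
)
print(results[0])  # first row of the aggregated results parquet (field layout assumed)
```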
open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B
[ "region:us" ]
2023-09-12T08:58:54+00:00
{"pretty_name": "Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B](https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-12T09:58:38.972064](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B/blob/main/results_2023-09-12T09-58-38.972064.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4674367304116677,\n \"acc_stderr\": 0.035284344124032196,\n \"acc_norm\": 0.4714260290393888,\n \"acc_norm_stderr\": 0.03526985338617593,\n \"mc1\": 0.2631578947368421,\n \"mc1_stderr\": 0.015415241740237014,\n \"mc2\": 0.3961362396399567,\n \"mc2_stderr\": 0.013785031017759436\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4948805460750853,\n \"acc_stderr\": 0.014610624890309157,\n \"acc_norm\": 0.5332764505119454,\n \"acc_norm_stderr\": 0.014578995859605802\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5902210714997013,\n \"acc_stderr\": 0.004907877144720015,\n \"acc_norm\": 0.7871937860983867,\n \"acc_norm_stderr\": 0.004084552641903664\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4377358490566038,\n \"acc_stderr\": 0.030533338430467516,\n \"acc_norm\": 0.4377358490566038,\n \"acc_norm_stderr\": 0.030533338430467516\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179964,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179964\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224469,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224469\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370331,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370331\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633363,\n \"acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633363\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5064516129032258,\n \"acc_stderr\": 0.02844163823354051,\n \"acc_norm\": 0.5064516129032258,\n \"acc_norm_stderr\": 0.02844163823354051\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398393,\n \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398393\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5151515151515151,\n \"acc_stderr\": 0.03560716516531061,\n \"acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.03560716516531061\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6580310880829016,\n \"acc_stderr\": 0.03423465100104283,\n \"acc_norm\": 0.6580310880829016,\n \"acc_norm_stderr\": 0.03423465100104283\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.44358974358974357,\n \"acc_stderr\": 0.025189149894764198,\n \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.025189149894764198\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.031968769891957786,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.031968769891957786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6330275229357798,\n \"acc_stderr\": 0.020664675659520525,\n \"acc_norm\": 0.6330275229357798,\n \"acc_norm_stderr\": 0.020664675659520525\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012383,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012383\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03471157907953427,\n \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03471157907953427\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5605381165919282,\n \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.5605381165919282,\n \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5214723926380368,\n \"acc_stderr\": 0.03924746876751129,\n \"acc_norm\": 0.5214723926380368,\n \"acc_norm_stderr\": 0.03924746876751129\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.049111471073657764,\n \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.049111471073657764\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.03023638994217308,\n \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.03023638994217308\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.6462324393358876,\n \"acc_stderr\": 0.017098184708161903,\n \"acc_norm\": 0.6462324393358876,\n \"acc_norm_stderr\": 0.017098184708161903\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n \"acc_stderr\": 0.014288343803925293,\n \"acc_norm\": 0.24022346368715083,\n \"acc_norm_stderr\": 0.014288343803925293\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.477124183006536,\n \"acc_stderr\": 0.028599936776089775,\n \"acc_norm\": 0.477124183006536,\n \"acc_norm_stderr\": 0.028599936776089775\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4845679012345679,\n \"acc_stderr\": 0.0278074900442762,\n \"acc_norm\": 0.4845679012345679,\n \"acc_norm_stderr\": 0.0278074900442762\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.363754889178618,\n \"acc_stderr\": 0.012286991879902884,\n \"acc_norm\": 0.363754889178618,\n \"acc_norm_stderr\": 0.012286991879902884\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.03030625772246832,\n \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.03030625772246832\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.020165523313907904,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.020165523313907904\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.04769300568972743,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.04769300568972743\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5061224489795918,\n \"acc_stderr\": 0.03200682020163908,\n \"acc_norm\": 0.5061224489795918,\n \"acc_norm_stderr\": 0.03200682020163908\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n \"mc1_stderr\": 0.015415241740237014,\n \"mc2\": 0.3961362396399567,\n \"mc2_stderr\": 0.013785031017759436\n }\n}\n```", "repo_url": "https://huggingface.co/PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": 
[{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|arc:challenge|25_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hellaswag|10_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-58-38.972064.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-58-38.972064.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-58-38.972064.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T09-58-38.972064.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T09-58-38.972064.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_12T09_58_38.972064", "path": ["results_2023-09-12T09-58-38.972064.parquet"]}, {"split": "latest", "path": ["results_2023-09-12T09-58-38.972064.parquet"]}]}]}
2023-09-12T08:59:57+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B on the Open LLM Leaderboard. The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-09-12T09:58:38.972064 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
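The "To load the details from a run" instruction in the cleaned card text above lost its accompanying snippet. As a rough sketch, a call along the following lines should work; note that the repository id is inferred from the leaderboard's `details_<org>__<model>` naming convention seen elsewhere in this dump (it is not spelled out in this record), while the config name and the "latest" split do appear in this record's metadata.

```python
from datasets import load_dataset

# Hypothetical repo id, inferred from the open-llm-leaderboard naming convention;
# the config name and the "latest" split are listed in this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_PeanutJar__LLaMa-2-PeanutButter_v19_R8-7B",
    "harness_truthfulqa_mc_0",
    split="latest",
)
print(data)
```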
[ "# Dataset Card for Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-12T09:58:38.972064(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-09-12T09:58:38.972064(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 32, 31, 180, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model PeanutJar/LLaMa-2-PeanutButter_v19_R8-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-12T09:58:38.972064(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
56b332e8a64ac4ba109288e8739690e6699768a3
# Megadiff, a dataset of source code changes Contains only single-function diffs. If you use Megadiff, please cite the following technical report: "[Megadiff: A Dataset of 600k Java Source Code Changes Categorized by Diff Size](http://arxiv.org/pdf/2108.04631)". Technical Report 2108.04631, Arxiv; 2021. ``` @techreport{megadiff, TITLE = {{Megadiff: A Dataset of 600k Java Source Code Changes Categorized by Diff Size}}, AUTHOR = {Martin Monperrus and Matias Martinez and He Ye and Fernanda Madeiral and Thomas Durieux and Zhongxing Yu}, URL = {http://arxiv.org/pdf/2108.04631}, INSTITUTION = {Arxiv}, NUMBER = {2108.04631}, YEAR = {2021}, } ```
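To complement the card, here is a minimal loading sketch using the Hugging Face `datasets` library. It is illustrative only: the repository id and the column names (`diff`, `is_single_chunk`, `is_single_function`, `buggy_function`, `fixed_function`) are taken from this entry's dataset_info metadata, and the call is the generic `load_dataset` API rather than anything prescribed by the Megadiff authors.

```python
from datasets import load_dataset

# Minimal sketch: load the single-function Megadiff split from the Hub.
ds = load_dataset("ASSERT-KTH/megadiff-single-function", split="train")

# Each record pairs the pre- and post-change version of one Java function
# with the unified diff between them (column names per the dataset_info).
example = ds[0]
print(example["diff"][:300])
print(example["buggy_function"][:200])
print(example["fixed_function"][:200])
```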
ASSERT-KTH/megadiff-single-function
[ "size_categories:10K<n<100K", "language:code", "arxiv:2108.04631", "region:us" ]
2023-09-12T09:05:19+00:00
{"language": ["code"], "size_categories": ["10K<n<100K"], "pretty_name": "megadiff", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "diff", "dtype": "string"}, {"name": "is_single_chunk", "dtype": "bool"}, {"name": "is_single_function", "dtype": "bool"}, {"name": "buggy_function", "dtype": "string"}, {"name": "fixed_function", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1624059115.752317, "num_examples": 72393}], "download_size": 546172221, "dataset_size": 1624059115.752317}}
2023-09-12T09:08:06+00:00
[ "2108.04631" ]
[ "code" ]
TAGS #size_categories-10K<n<100K #language-code #arxiv-2108.04631 #region-us
# Megadiff, a dataset of source code changes Contains only single-function diffs. If you use Megadiff, please cite the following technical report: "Megadiff: A Dataset of 600k Java Source Code Changes Categorized by Diff Size". Technical Report 2108.04631, Arxiv; 2021.
[ "# Megadiff, a dataset of source code changes\n\nContains only single-function diffs.\n\nIf you use Megadiff, please cite the following technical report:\n\n\"Megadiff: A Dataset of 600k Java Source Code Changes Categorized by Diff Size\". Technical Report 2108.04631, Arxiv; 2021." ]
[ "TAGS\n#size_categories-10K<n<100K #language-code #arxiv-2108.04631 #region-us \n", "# Megadiff, a dataset of source code changes\n\nContains only single-function diffs.\n\nIf you use Megadiff, please cite the following technical report:\n\n\"Megadiff: A Dataset of 600k Java Source Code Changes Categorized by Diff Size\". Technical Report 2108.04631, Arxiv; 2021." ]
[ 31, 71 ]
[ "passage: TAGS\n#size_categories-10K<n<100K #language-code #arxiv-2108.04631 #region-us \n# Megadiff, a dataset of source code changes\n\nContains only single-function diffs.\n\nIf you use Megadiff, please cite the following technical report:\n\n\"Megadiff: A Dataset of 600k Java Source Code Changes Categorized by Diff Size\". Technical Report 2108.04631, Arxiv; 2021." ]
eeccc9dcf6b91998a9f1055e4b8b1f4e927b77e8
# Dataset Card for "medqa_alpaca_format" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
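The card itself is still a stub, but the dataset_info recorded later in this entry (a `question` string, a `choices` sequence of strings, a `solution` string, with `train` and `test` splits) is enough to sketch how the data might be consumed. This is an illustrative example inferred from that metadata, not documentation supplied by the dataset author.

```python
from datasets import load_dataset

# Field and split names are taken from this record's dataset_info metadata.
ds = load_dataset("maximegmd/medqa_alpaca_format")

sample = ds["train"][0]
print(sample["question"])
for choice in sample["choices"]:
    print("-", choice)
print("solution:", sample["solution"])
```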
maximegmd/medqa_alpaca_format
[ "region:us" ]
2023-09-12T09:05:52+00:00
{"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "solution", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1184018, "num_examples": 1273}, {"name": "train", "num_bytes": 9249332, "num_examples": 10178}], "download_size": 5933919, "dataset_size": 10433350}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}, {"split": "train", "path": "data/train-*"}]}]}
2023-09-12T10:27:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for "medqa_alpaca_format" More Information needed
[ "# Dataset Card for \"medqa_alpaca_format\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"medqa_alpaca_format\"\n\nMore Information needed" ]
[ 6, 17 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"medqa_alpaca_format\"\n\nMore Information needed" ]
968feb4f3b8378be484dd3ab65985ea9101ac3c7
# Dataset of koshimizu_sachiko/輿水幸子/코시미즈사치코 (THE iDOLM@STER: Cinderella Girls) This is the dataset of koshimizu_sachiko/輿水幸子/코시미즈사치코 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are `short_hair, purple_hair, brown_eyes, hair_ornament, hairclip`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 540.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koshimizu_sachiko_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 339.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koshimizu_sachiko_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1183 | 712.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koshimizu_sachiko_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 491.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koshimizu_sachiko_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1183 | 962.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koshimizu_sachiko_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/koshimizu_sachiko_idolmastercinderellagirls', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, school_uniform, solo, :d, blush, looking_at_viewer, open_mouth, bow, hand_on_own_cheek | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, school_uniform, solo, :d, blush, open_mouth, skirt_lift, black_thighhighs, bow, grey_hair, looking_at_viewer | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, black_thighhighs, open_mouth, solo, wrist_cuffs, :d, black_wings, blush, looking_at_viewer, dress | | 3 | 22 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, solo, blush, long_sleeves, simple_background, white_background, bangs, hair_intakes, looking_at_viewer, open_mouth, upper_body, shirt, yellow_bowtie, skirt, :d, hair_flaps | | 4 | 24 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | cleavage_cutout, collar, elbow_gloves, heart_cutout, 1girl, skirt, solo, chain, wings, cuffs, navel, :d, open_mouth, midriff, black_thighhighs, blush, grey_hair, looking_at_viewer, microphone, garter_straps, pinstripe_pattern | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, heart, looking_at_viewer, puffy_short_sleeves, solo, white_gloves, witch_hat, bangs, bat_(animal), blush, frilled_skirt, hair_flaps, striped, thighhighs, :3, :d, center_frills, hair_intakes, jack-o'-lantern, open_mouth, pumpkin, boots, bowtie, cape, ghost, happy_halloween, high_heels, holding_wand, jewelry, mismatched_legwear, white_shirt | | 6 | 10 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, bangs, blue_skirt, long_sleeves, pleated_skirt, suspender_skirt, white_shirt, center_frills, hair_flaps, solo, closed_mouth, collared_shirt, hair_intakes, simple_background, blush, :3, smile | | 7 | 9 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, bangs, blue_skirt, collared_shirt, long_sleeves, suspender_skirt, white_shirt, blush, hair_flaps, simple_background, solo, white_background, hair_intakes, open_mouth, vertical-striped_skirt, 
:d, looking_at_viewer, center_frills, frilled_skirt, light_purple_hair, necktie, pleated_skirt, purple_ascot | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, blush, nipples, smile, looking_at_viewer, navel, nude, pussy, small_breasts, solo, censored, lying, open_mouth | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, demon_girl, demon_horns, demon_tail, smile, solo, bare_shoulders, blush, demon_wings, detached_sleeves, looking_at_viewer, black_thighhighs, dress, purple_wings, simple_background, skirt, detached_collar, heart, open_mouth, white_background | | 10 | 6 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, blush, heart-shaped_pupils, navel, open_mouth, sweat, 1boy, :d, drooling, flat_chest, hetero, nipples, saliva, solo_focus, tears, happy_sex, penis, side-tie_bikini_bottom, torogao, vaginal, bar_censor, looking_at_viewer, on_back, spread_legs | | 11 | 7 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | 1girl, purple_dress, solo, bangs, bare_shoulders, black_gloves, looking_at_viewer, blush, hair_flower, black_hairband, black_wings, mini_crown, smile, yellow_eyes | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | school_uniform | solo | :d | blush | looking_at_viewer | open_mouth | bow | hand_on_own_cheek | skirt_lift | black_thighhighs | grey_hair | wrist_cuffs | black_wings | dress | long_sleeves | simple_background | white_background | bangs | hair_intakes | upper_body | shirt | yellow_bowtie | skirt | hair_flaps | cleavage_cutout | collar | elbow_gloves | heart_cutout | chain | wings | cuffs | navel | midriff | microphone | garter_straps | pinstripe_pattern | heart | puffy_short_sleeves | white_gloves | witch_hat | bat_(animal) | frilled_skirt | striped | thighhighs | :3 | center_frills | jack-o'-lantern | pumpkin | boots | bowtie | cape | ghost | happy_halloween | high_heels | holding_wand | jewelry | mismatched_legwear | white_shirt | blue_skirt | pleated_skirt | suspender_skirt | closed_mouth | collared_shirt | smile | vertical-striped_skirt | light_purple_hair | necktie | purple_ascot | nipples | nude | pussy | small_breasts | censored | lying | demon_girl | demon_horns | demon_tail | bare_shoulders | demon_wings | detached_sleeves | purple_wings | detached_collar | heart-shaped_pupils | sweat | 1boy | drooling | flat_chest | hetero | saliva | solo_focus | tears | happy_sex | penis | side-tie_bikini_bottom | torogao | vaginal | bar_censor | on_back | spread_legs | purple_dress | black_gloves | hair_flower | black_hairband | mini_crown | yellow_eyes | 
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------------|:-------|:-----|:--------|:--------------------|:-------------|:------|:--------------------|:-------------|:-------------------|:------------|:--------------|:--------------|:--------|:---------------|:--------------------|:-------------------|:--------|:---------------|:-------------|:--------|:----------------|:--------|:-------------|:------------------|:---------|:---------------|:---------------|:--------|:--------|:--------|:--------|:----------|:-------------|:----------------|:--------------------|:--------|:----------------------|:---------------|:------------|:---------------|:----------------|:----------|:-------------|:-----|:----------------|:------------------|:----------|:--------|:---------|:-------|:--------|:------------------|:-------------|:---------------|:----------|:---------------------|:--------------|:-------------|:----------------|:------------------|:---------------|:-----------------|:--------|:-------------------------|:--------------------|:----------|:---------------|:----------|:-------|:--------|:----------------|:-----------|:--------|:-------------|:--------------|:-------------|:-----------------|:--------------|:-------------------|:---------------|:------------------|:----------------------|:--------|:-------|:-----------|:-------------|:---------|:---------|:-------------|:--------|:------------|:--------|:-------------------------|:----------|:----------|:-------------|:----------|:--------------|:---------------|:---------------|:--------------|:-----------------|:-------------|:--------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | X | X | X | X | | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 22 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | X | X | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 24 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | 
![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | X | X | X | X | | | | X | X | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 5 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | X | X | X | X | | | | | | | | | | | | X | X | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 6 | 10 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | | X | | | | | | | | | | | X | X | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 7 | 9 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | X | X | X | X | X | | | | | | | | | X | X | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | X | X | X | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 8 | 7 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | X | | X | X | X | | | | X | | | | X | | X | X | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | 10 | 6 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | 11 | 7 | ![](samples/11/clu11-sample0.png) | ![](samples/11/clu11-sample1.png) | ![](samples/11/clu11-sample2.png) | ![](samples/11/clu11-sample3.png) | ![](samples/11/clu11-sample4.png) | X | | X | | X | X | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
CyberHarem/koshimizu_sachiko_idolmastercinderellagirls
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T09:09:36+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T09:44:56+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of koshimizu\_sachiko/輿水幸子/코시미즈사치코 (THE iDOLM@STER: Cinderella Girls) ============================================================================= This is the dataset of koshimizu\_sachiko/輿水幸子/코시미즈사치코 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags. The core tags of this character are 'short\_hair, purple\_hair, brown\_eyes, hair\_ornament, hairclip', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]
a23b72eeae66bbcb9288fdb9dca363fa7754fbb6
# Dataset Card for Evaluation run of Sao10K/Stheno-1.1-L2-13B

## Dataset Description

- **Homepage:** 
- **Repository:** https://huggingface.co/Sao10K/Stheno-1.1-L2-13B
- **Paper:** 
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]

### Dataset Summary

Dataset automatically created during the evaluation run of model [Sao10K/Stheno-1.1-L2-13B](https://huggingface.co/Sao10K/Stheno-1.1-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-1.1-L2-13B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-10-23T21:31:21.044971](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-1.1-L2-13B/blob/main/results_2023-10-23T21-31-21.044971.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):

```python
{
    "all": {
        "em": 0.4002726510067114,
        "em_stderr": 0.00501758267554197,
        "f1": 0.46779362416107595,
        "f1_stderr": 0.00480612571124351,
        "acc": 0.41588812169655687,
        "acc_stderr": 0.0097951551792708
    },
    "harness|drop|3": {
        "em": 0.4002726510067114,
        "em_stderr": 0.00501758267554197,
        "f1": 0.46779362416107595,
        "f1_stderr": 0.00480612571124351
    },
    "harness|gsm8k|5": {
        "acc": 0.07960576194086429,
        "acc_stderr": 0.007455924338676254
    },
    "harness|winogrande|5": {
        "acc": 0.7521704814522494,
        "acc_stderr": 0.012134386019865346
    }
}
```

### Supported Tasks and Leaderboards

[More Information Needed]

### Languages

[More Information Needed]

## Dataset Structure

### Data Instances

[More Information Needed]

### Data Fields

[More Information Needed]

### Data Splits

[More Information Needed]

## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

[More Information Needed]

### Citation Information

[More Information Needed]

### Contributions

[More Information Needed]
open-llm-leaderboard/details_Sao10K__Stheno-1.1-L2-13B
[ "region:us" ]
2023-09-12T09:14:29+00:00
{"pretty_name": "Evaluation run of Sao10K/Stheno-1.1-L2-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Sao10K/Stheno-1.1-L2-13B](https://huggingface.co/Sao10K/Stheno-1.1-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Stheno-1.1-L2-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T21:31:21.044971](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-1.1-L2-13B/blob/main/results_2023-10-23T21-31-21.044971.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4002726510067114,\n \"em_stderr\": 0.00501758267554197,\n \"f1\": 0.46779362416107595,\n \"f1_stderr\": 0.00480612571124351,\n \"acc\": 0.41588812169655687,\n \"acc_stderr\": 0.0097951551792708\n },\n \"harness|drop|3\": {\n \"em\": 0.4002726510067114,\n \"em_stderr\": 0.00501758267554197,\n \"f1\": 0.46779362416107595,\n \"f1_stderr\": 0.00480612571124351\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07960576194086429,\n \"acc_stderr\": 0.007455924338676254\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.012134386019865346\n }\n}\n```", "repo_url": "https://huggingface.co/Sao10K/Stheno-1.1-L2-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|arc:challenge|25_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T21_31_21.044971", "path": ["**/details_harness|drop|3_2023-10-23T21-31-21.044971.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T21-31-21.044971.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T21_31_21.044971", "path": ["**/details_harness|gsm8k|5_2023-10-23T21-31-21.044971.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T21-31-21.044971.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hellaswag|10_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-14-13.361250.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-14-13.361250.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T10-14-13.361250.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T10-14-13.361250.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T10-14-13.361250.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T21_31_21.044971", "path": ["**/details_harness|winogrande|5_2023-10-23T21-31-21.044971.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T21-31-21.044971.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_12T10_14_13.361250", "path": ["results_2023-09-12T10-14-13.361250.parquet"]}, {"split": "2023_10_23T21_31_21.044971", "path": ["results_2023-10-23T21-31-21.044971.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T21-31-21.044971.parquet"]}]}]}
2023-10-23T20:31:34+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Sao10K/Stheno-1.1-L2-13B ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model Sao10K/Stheno-1.1-L2-13B on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-10-23T21:31:21.044971 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
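The card above points at a loading snippet ("To load the details from a run, you can for instance do the following:") that is not reproduced in this text field. A minimal sketch is given below, assuming the `datasets` library and the usual Open LLM Leaderboard naming convention for details repositories; the repository id is an assumption rather than something stated in the card, while the configuration and split names are taken from the data_files listing above.

```python
# Minimal sketch, assuming the `datasets` library and the usual naming
# convention for Open LLM Leaderboard details repositories.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Stheno-1.1-L2-13B",  # assumed repo id
    "harness_winogrande_5",  # any of the 64 configurations listed in the metadata
    split="latest",          # the "latest" split always points at the newest run
)
print(data)
```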
[ "# Dataset Card for Evaluation run of Sao10K/Stheno-1.1-L2-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Stheno-1.1-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T21:31:21.044971(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Sao10K/Stheno-1.1-L2-13B", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Stheno-1.1-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-10-23T21:31:21.044971(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 171, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Sao10K/Stheno-1.1-L2-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Stheno-1.1-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T21:31:21.044971(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
c43ce1afe7c90d205b33d599111c8974a1ce1cd9
# Dataset of hinatsu/ヒナツ (Pokémon) This is the dataset of hinatsu/ヒナツ (Pokémon), containing 500 images and their tags. The core tags of this character are `short_hair, red_hair, bangs, red_eyes, breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:------------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 500 | 629.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hinatsu_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 500 | 325.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hinatsu_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 1251 | 695.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hinatsu_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 500 | 541.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hinatsu_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 1251 | 1018.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hinatsu_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/hinatsu_pokemon', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, brown_bag, cowlick, hood, long_sleeves, solo, collarbone, gradient_legwear, looking_at_viewer, smile, blush, legwear_under_shorts, pantyhose_under_shorts, red_pantyhose, closed_mouth, hand_up, sitting, grey_jacket, simple_background | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, blue_hoodie, blush, cowlick, gradient_legwear, legwear_under_shorts, long_sleeves, looking_at_viewer, pantyhose_under_shorts, smile, brown_bag, gradient_clothes, jacket, solo, ass, brown_hair, open_mouth, two-tone_legwear, closed_mouth, crossed_legs, red_pantyhose, sitting | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, cowlick, gradient_legwear, long_sleeves, simple_background, white_background, black_footwear, boots, closed_mouth, gradient_clothes, smile, solo, brown_bag, full_body, hood, legwear_under_shorts, looking_at_viewer, sitting, hand_up, pantyhose_under_shorts, grey_jacket, red_pantyhose, black_shorts, blush, bracelet | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, ass, blue_hoodie, cowlick, from_behind, long_sleeves, looking_at_viewer, looking_back, solo, blush, gradient_legwear, simple_background, brown_pantyhose, closed_mouth, hood_down, jacket, open_mouth, white_background | | 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, long_sleeves, looking_at_viewer, red_pantyhose, smile, solo, blue_hoodie, from_behind, looking_back, cowlick, gradient_legwear, jacket, shiny_clothes, simple_background, thighs, ass_focus, blush, closed_mouth, white_background | | 5 | 9 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1boy, 1girl, blush, hetero, nipples, penis, pussy, sex, spread_legs, vaginal, large_breasts, navel, missionary, on_back, collarbone, completely_nude, solo_focus, open_mouth, pov, looking_at_viewer, mosaic_censoring, sweat, trembling, uncensored | | 6 | 11 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1boy, 1girl, ass, hetero, blush, cowlick, pantyhose, uncensored, vaginal, anus, torn_clothes, gradient_legwear, open_mouth, cum_in_pussy, 
overflow, clothed_female_nude_male, jacket, long_sleeves, heart, looking_at_viewer, looking_back, sex_from_behind, solo_focus, testicles, veiny_penis | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | brown_bag | cowlick | hood | long_sleeves | solo | collarbone | gradient_legwear | looking_at_viewer | smile | blush | legwear_under_shorts | pantyhose_under_shorts | red_pantyhose | closed_mouth | hand_up | sitting | grey_jacket | simple_background | blue_hoodie | gradient_clothes | jacket | ass | brown_hair | open_mouth | two-tone_legwear | crossed_legs | white_background | black_footwear | boots | full_body | black_shorts | bracelet | from_behind | looking_back | brown_pantyhose | hood_down | shiny_clothes | thighs | ass_focus | 1boy | hetero | nipples | penis | pussy | sex | spread_legs | vaginal | large_breasts | navel | missionary | on_back | completely_nude | solo_focus | pov | mosaic_censoring | sweat | trembling | uncensored | pantyhose | anus | torn_clothes | cum_in_pussy | overflow | clothed_female_nude_male | heart | sex_from_behind | testicles | veiny_penis | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:----------|:-------|:---------------|:-------|:-------------|:-------------------|:--------------------|:--------|:--------|:-----------------------|:-------------------------|:----------------|:---------------|:----------|:----------|:--------------|:--------------------|:--------------|:-------------------|:---------|:------|:-------------|:-------------|:-------------------|:---------------|:-------------------|:-----------------|:--------|:------------|:---------------|:-----------|:--------------|:---------------|:------------------|:------------|:----------------|:---------|:------------|:-------|:---------|:----------|:--------|:--------|:------|:--------------|:----------|:----------------|:--------|:-------------|:----------|:------------------|:-------------|:------|:-------------------|:--------|:------------|:-------------|:------------|:-------|:---------------|:---------------|:-----------|:---------------------------|:--------|:------------------|:------------|:--------------| | 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | X | X | | X | X | X | X | X | X | X | X | | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 8 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | X | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | 
![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | | X | X | | X | X | | X | | | | X | | | | X | X | | X | X | | X | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 7 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | | X | X | | X | X | X | X | | | X | X | | | | X | X | | X | | | | | | X | | | | | | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 5 | 9 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | | | | | X | | X | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | 6 | 11 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | | X | | X | | | X | X | | X | | | | | | | | | | | X | X | | X | | | | | | | | | | X | | | | | | X | X | | | | | | X | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X |
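As a complement to the waifuc example in the card above, the packaged zips in the List of Packages table can also be used directly. The sketch below pulls the 800px IMG+TXT package with `huggingface_hub` and pairs each image with its tag file; the one-`.txt`-of-tags-per-image layout follows the IMG+TXT description in the table and is otherwise an assumption.

```python
# Minimal sketch: download the 800px IMG+TXT package listed in the card above
# and pair each image with its tag file. The per-image .txt layout is assumed
# from the IMG+TXT package description.
import os
import zipfile
from huggingface_hub import hf_hub_download

zip_file = hf_hub_download(
    repo_id='CyberHarem/hinatsu_pokemon',
    repo_type='dataset',
    filename='dataset-800.zip',
)

dataset_dir = 'hinatsu_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# Walk the extracted files and print each image path with its tags.
for root, _, files in os.walk(dataset_dir):
    for name in files:
        if name.lower().endswith(('.png', '.jpg', '.jpeg', '.webp')):
            image_path = os.path.join(root, name)
            tag_path = os.path.splitext(image_path)[0] + '.txt'
            if os.path.exists(tag_path):
                with open(tag_path, 'r', encoding='utf-8') as f:
                    print(image_path, f.read().strip())
```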
CyberHarem/hinatsu_pokemon
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2023-09-12T09:35:07+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-01-16T21:49:08+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of hinatsu/ヒナツ (Pokémon) ================================ This is the dataset of hinatsu/ヒナツ (Pokémon), containing 500 images and their tags. The core tags of this character are 'short\_hair, red\_hair, bangs, red\_eyes, breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ 44, 61, 5, 4 ]
[ "passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version" ]