70dcbaa0899db7f083291f86dbabdd6f3dab8e6b
# TL;DR SFT Dataset for OpenAI's [Summarize from Feedback](https://openai.com/blog/summarization/) task

The dataset is directly taken from https://github.com/openai/summarize-from-feedback/tree/700967448d10004279f138666442bf1497d0e705#reddit-tldr-dataset

These columns are taken directly from the aforementioned dataset:

* **id**: unique identifier for the post
* **subreddit**: subreddit the post was taken from
* **title**: title of the post
* **post**: body of the post
* **summary**: summary of the post
* **reference_response**: reference response for the post

These columns are added by this preprocessing script:

* **query**: length-limited query for summarization: OAI pre-processes the main text (title + subreddit + post), ensuring it has exactly 512 tokens; if the main text is too long, it tries to truncate at the last `\n`; if it is too short, it pads the main text ([summarize_from_feedback/tasks.py#L98-L165](https://github.com/openai/summarize-from-feedback/blob/700967448d10004279f138666442bf1497d0e705/summarize_from_feedback/tasks.py#L98-L165)). Padding is either a space or the `[PAD]` token (see Args below).
* **query_token**: tokenized version of `query`
* **reference_response_token**: tokenized version of `reference_response`
* **reference_response_token_len**: length of `reference_response_token`
* **query_reference_response**: concatenation of `query.strip()` and `reference_response`
* **query_reference_response_token**: tokenized version of `query_reference_response`, up to `max_sft_query_response_length` tokens
* **query_reference_response_token_len**: length of `query_reference_response_token`

# Args

```python
{'base_model': 'gpt2',
 'hf_entity': 'vwxyzjn',
 'max_rm_query_response_length': 665,
 'max_rm_response_length': 153,
 'max_sft_query_response_length': 560,
 'max_sft_response_length': 48}
{'format_str': 'SUBREDDIT: r/{subreddit}\n'
               '\n'
               'TITLE: {title}\n'
               '\n'
               'POST: {post}\n'
               '\n'
               'TL;DR:',
 'length': 512,
 'pad_side': 'left',
 'padding': [220],
 'truncate_field': 'post',
 'truncate_text': '\n'}
```
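The truncate-then-pad logic described above can be sketched as follows. This is a minimal illustration, not the actual OAI code: `toy_tokenize` is a hypothetical stand-in for the GPT-2 tokenizer (the real script counts GPT-2 tokens and pads with token id 220, the space token, or a `[PAD]` string, depending on the Args).

```python
# Hypothetical sketch of the query construction: format the post, truncate
# the `post` field at the last newline until the query fits, then left-pad.

FORMAT_STR = "SUBREDDIT: r/{subreddit}\n\nTITLE: {title}\n\nPOST: {post}\n\nTL;DR:"

def toy_tokenize(text):
    # stand-in tokenizer: one "token" per whitespace-separated word
    return text.split()

def build_query(row, length=512, pad_token="[PAD]"):
    post = row["post"]
    while True:
        query = FORMAT_STR.format(
            subreddit=row["subreddit"], title=row["title"], post=post
        )
        tokens = toy_tokenize(query)
        if len(tokens) <= length:
            break
        cut = post.rfind("\n")  # truncate_field='post', truncate_text='\n'
        if cut > 0:
            post = post[:cut]   # drop everything after the last newline
        else:
            post = " ".join(post.split()[:-1])  # fallback: drop last word
    # pad_side='left': left-pad so the query is exactly `length` tokens
    return [pad_token] * (length - len(tokens)) + tokens

row = {"subreddit": "test", "title": "A title", "post": "Some short post body."}
query = build_query(row)
```

With the real tokenizer the padding token id is 220 and the result feeds directly into `query_token`.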
vwxyzjn/summarize_from_feedback_tldr_3_filtered_oai_preprocessing_gpt2_48
[ "region:us" ]
2023-11-23T02:00:27+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "subreddit", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "post", "dtype": "string"}, {"name": "summary", "dtype": "string"}, {"name": "query_token", "sequence": "int64"}, {"name": "query", "dtype": "string"}, {"name": "reference_response", "dtype": "string"}, {"name": "reference_response_token", "sequence": "int64"}, {"name": "reference_response_token_len", "dtype": "int64"}, {"name": "query_reference_response", "dtype": "string"}, {"name": "query_reference_response_token", "sequence": "int64"}, {"name": "query_reference_response_token_len", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1594357961, "num_examples": 116722}, {"name": "validation", "num_bytes": 88082343, "num_examples": 6447}, {"name": "test", "num_bytes": 89575748, "num_examples": 6553}], "download_size": 0, "dataset_size": 1772016052}}
2023-12-24T22:50:33+00:00
[]
[]
b19d205f93f331737b266d510b5f812a1b8941db
# Dataset Card for "undl_es2en_translation" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
bot-yaya/undl_es2en_translation
[ "region:us" ]
2023-11-23T02:07:25+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "clean_es", "sequence": "string"}, {"name": "clean_en", "sequence": "string"}, {"name": "record", "dtype": "string"}, {"name": "es2en", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 15044031141, "num_examples": 165840}], "download_size": 0, "dataset_size": 15044031141}}
2023-11-23T11:55:04+00:00
[]
[]
7ecccfba0ce82414edb5c06a756b812159d51867
# Dataset Card for "summarize_from_feedback_oai_preprocessing_gpt2_48" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
vwxyzjn/summarize_from_feedback_oai_preprocessing_gpt2_48
[ "region:us" ]
2023-11-23T02:07:48+00:00
{"dataset_info": {"features": [{"name": "info", "struct": [{"name": "id", "dtype": "string"}, {"name": "post", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "subreddit", "dtype": "string"}, {"name": "site", "dtype": "string"}, {"name": "article", "dtype": "string"}]}, {"name": "summaries", "list": [{"name": "text", "dtype": "string"}, {"name": "policy", "dtype": "string"}, {"name": "note", "dtype": "string"}]}, {"name": "choice", "dtype": "int32"}, {"name": "worker", "dtype": "string"}, {"name": "batch", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "extra", "struct": [{"name": "confidence", "dtype": "int32"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query", "dtype": "string"}, {"name": "response0", "dtype": "string"}, {"name": "response0_token", "sequence": "int64"}, {"name": "response0_token_len", "dtype": "int64"}, {"name": "response1", "dtype": "string"}, {"name": "response1_token", "sequence": "int64"}, {"name": "response1_token_len", "dtype": "int64"}, {"name": "response0_policy", "dtype": "string"}, {"name": "response1_policy", "dtype": "string"}, {"name": "policies", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 790831734, "num_examples": 92858}, {"name": "validation", "num_bytes": 743452770, "num_examples": 86086}], "download_size": 125252937, "dataset_size": 1534284504}}
2023-11-28T19:23:43+00:00
[]
[]
89056437a6c4b4915c9c2a9351803beb3ac6f9ce
# Dataset Card for "undl_ar2en_translation" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
bot-yaya/undl_ar2en_translation
[ "region:us" ]
2023-11-23T02:09:06+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "clean_ar", "sequence": "string"}, {"name": "clean_en", "sequence": "string"}, {"name": "record", "dtype": "string"}, {"name": "ar2en", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 16560948916, "num_examples": 165840}], "download_size": 0, "dataset_size": 16560948916}}
2023-11-23T11:54:22+00:00
[]
[]
8799480e41f697a85653d59f33a95551d2863aa7
# Dataset Card for "mal-url-treat-no-trunc" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
bgspaditya/mal-url-treat-no-trunc
[ "region:us" ]
2023-11-23T02:12:13+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "val", "path": "data/val-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "url", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "type_code", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 42427560.8, "num_examples": 512900}, {"name": "val", "num_bytes": 5303403.7395390915, "num_examples": 64112}, {"name": "test", "num_bytes": 5303486.460460909, "num_examples": 64113}], "download_size": 32110906, "dataset_size": 53034451.0}}
2023-11-23T02:12:40+00:00
[]
[]
9d25f8f9016e38f7b580ef8a99c83713d43b5bf4
### Description 🙅‍♂️🤖

The GhanaNews dataset is a collection of news articles from various Ghanaian news portals (MyJoyOnline, GraphicOnline, GhanaWeb, PulseGh, CitiNewsOnline, etc.). The dataset is provided to the academic community for research purposes in data mining (clustering, classification, etc.), information retrieval (ranking, search, etc.), XML, data compression, data streaming, and any other non-commercial activity.

The Ghana news topic classification dataset is constructed by Theophilus Siameh ([email protected]) from the dataset above.

### Context QA: in-context question answering from an article

```shell
{"article": "...", "question": "...", "answer": "..."}
```

### Article and Summary

```shell
{"article": "...", "summary": "..."}
```

### Dataset Format

```shell
{
  "title": "...",
  "content": "...",
  "author": "...",
  "category": "...",
  "published_date": "...",
  "page_url": "..."
}
```

### Load Dataset

```shell
pip install datasets
```

```python
import pandas as pd
from datasets import load_dataset

train = load_dataset("worldboss/ghana-news", split="train")
test = load_dataset("worldboss/ghana-news", split="test")

pd.DataFrame(train).head()
```
worldboss/ghana-news
[ "task_categories:conversational", "task_categories:text-generation", "task_categories:summarization", "task_categories:question-answering", "task_categories:text-classification", "task_categories:text-retrieval", "task_categories:translation", "size_categories:10K<n<100K", "language:en", "license:apache-2.0", "ghana", "news", "politics", "science", "business", "ghana-news", "region:us" ]
2023-11-23T02:46:40+00:00
{"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["conversational", "text-generation", "summarization", "question-answering", "text-classification", "text-retrieval", "translation"], "pretty_name": "No Robots", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "title", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "author", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "published_date", "dtype": "string"}, {"name": "page_url", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 81377758, "num_examples": 30374}, {"name": "test", "num_bytes": 12272769, "num_examples": 9489}], "download_size": 0, "dataset_size": 93650527}, "tags": ["ghana", "news", "politics", "science", "business", "ghana-news"]}
2023-12-15T05:57:49+00:00
[]
[ "en" ]
4b7aaa9104cb677e0279e60b36069af6cd13cc3b
# Dataset Card for "affixal_nonneg" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
joey234/affixal_nonneg
[ "region:us" ]
2023-11-23T03:11:26+00:00
{"dataset_info": {"features": [{"name": "word", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 6870, "num_examples": 330}], "download_size": 4643, "dataset_size": 6870}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-11-24T02:57:08+00:00
[]
[]
36ce2a0afb1e531a226c4c9724cb1b3ee0f42312
Spider dataset with user questions, column names, and table names only; the Spider `train_other` split is also included.
gaoyzz/spider_SQL_prompts
[ "license:apache-2.0", "region:us" ]
2023-11-23T03:25:29+00:00
{"license": "apache-2.0"}
2023-11-23T05:01:35+00:00
[]
[]
47d3cbf2245330f712427476b5f050fabde9ef02
# Dataset Card for "b50562e5" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
result-kand2-sdxl-wuerst-karlo/b50562e5
[ "region:us" ]
2023-11-23T03:38:22+00:00
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 170, "num_examples": 10}], "download_size": 1334, "dataset_size": 170}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-11-23T03:38:23+00:00
[]
[]
1427d0b32c1d8ea1d57a132dcec8105c683a69e0
# Dataset Card for "indic-superb-augmented" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
collabora/indic-superb-augmented
[ "region:us" ]
2023-11-23T03:51:35+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "transcription", "dtype": "string"}, {"name": "duration", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 46157129192.0, "num_examples": 24872}, {"name": "test", "num_bytes": 1594957930.0, "num_examples": 872}], "download_size": 47266159084, "dataset_size": 47752087122.0}}
2023-11-23T04:22:12+00:00
[]
[]
4cc33f0ebe8b27733035e097b7bd4976fc1a786d
# SayCan This repo contains the data for ["Do As I Can, Not As I Say: Grounding Language in Robotic Affordances"](https://say-can.github.io). The original data link is here: https://raw.githubusercontent.com/say-can/say-can.github.io/main/data/saycan_plan_v0_l.tsv This dataset is distributed with the CC BY 4.0 license.
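A minimal sketch of reading the plan TSV into (INPUT, OUTPUT) pairs using only the standard library. The column names here follow this dataset's features (`INPUT`, `OUTPUT`); the raw TSV at the link above may use different headers, so treat this as an assumption.

```python
import csv
import io

def load_saycan_pairs(tsv_text):
    # parse tab-separated rows with a header line into (INPUT, OUTPUT) pairs
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    return [(row["INPUT"], row["OUTPUT"]) for row in reader]

# hypothetical sample row in the same shape as the dataset's test split
sample = (
    "INPUT\tOUTPUT\n"
    "How would you bring me an apple?\t"
    "1. find an apple, 2. pick up the apple, 3. bring it to you, 4. done.\n"
)
pairs = load_saycan_pairs(sample)
```

In practice you would fetch the file from the URL above first and pass its text to `load_saycan_pairs`.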
chiayewken/saycan
[ "region:us" ]
2023-11-23T04:17:30+00:00
{"dataset_info": {"features": [{"name": "INPUT", "dtype": "string"}, {"name": "OUTPUT", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 14865, "num_examples": 99}], "download_size": 4765, "dataset_size": 14865}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}]}
2023-11-23T04:21:11+00:00
[]
[]
471e694083ccd3e891a7e1af27b6792314e2832f
# Dataset Card for "undl_es2en_aligned" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
bot-yaya/undl_es2en_aligned
[ "region:us" ]
2023-11-23T04:21:00+00:00
{"dataset_info": {"features": [{"name": "record", "dtype": "string"}, {"name": "clean_para_index_set_pair", "dtype": "string"}, {"name": "src", "dtype": "string"}, {"name": "dst", "dtype": "string"}, {"name": "src_text", "dtype": "string"}, {"name": "dst_text", "dtype": "string"}, {"name": "src_rate", "dtype": "float64"}, {"name": "dst_rate", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 10706600254, "num_examples": 15967431}], "download_size": 0, "dataset_size": 10706600254}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-11-23T11:55:18+00:00
[]
[]
372f47388286cf19d8479f281294d3120e878fcb
# Dataset Card for "undl_ar2en_aligned" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
bot-yaya/undl_ar2en_aligned
[ "region:us" ]
2023-11-23T04:28:32+00:00
{"dataset_info": {"features": [{"name": "record", "dtype": "string"}, {"name": "clean_para_index_set_pair", "dtype": "string"}, {"name": "src", "dtype": "string"}, {"name": "dst", "dtype": "string"}, {"name": "src_text", "dtype": "string"}, {"name": "dst_text", "dtype": "string"}, {"name": "src_rate", "dtype": "float64"}, {"name": "dst_rate", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 12012712129, "num_examples": 15217906}], "download_size": 0, "dataset_size": 12012712129}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-11-23T11:54:37+00:00
[]
[]
a245ad24ecbde7584996f00b06306db393cf0517
# Dataset
nsjzg/dataset1
[ "task_categories:image-classification", "size_categories:n<1K", "language:en", "human face", "region:us" ]
2023-11-23T05:35:02+00:00
{"language": ["en"], "size_categories": ["n<1K"], "task_categories": ["image-classification"], "tags": ["human face"]}
2023-11-27T03:41:15+00:00
[]
[ "en" ]
9393b0fcd74bd4dd6917e5a98f808a551a7a5048
# Dataset Card for "find_sent_before_sent_train_400_eval_40_first_permute" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_sent_before_sent_train_400_eval_40_first_permute
[ "region:us" ]
2023-11-23T06:10:41+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5869405.081244598, "num_examples": 4188}, {"name": "validation", "num_bytes": 232610, "num_examples": 200}], "download_size": 1246434, "dataset_size": 6102015.081244598}}
2023-11-23T06:10:49+00:00
[]
[]
84b7ea7a73e452562b204369dc770e0f98de1572
# Dataset Card for "find_sent_before_sent_train_400_eval_40_last_permute" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_sent_before_sent_train_400_eval_40_last_permute
[ "region:us" ]
2023-11-23T06:10:49+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5869405.081244598, "num_examples": 4188}, {"name": "validation", "num_bytes": 232610, "num_examples": 200}], "download_size": 1243567, "dataset_size": 6102015.081244598}}
2023-11-23T06:10:57+00:00
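The `dataset_info` metadata embedded in each card above can be parsed programmatically. A minimal sketch (using only the `splits` fields shown in the JSON above; the helper name `summarize_splits` is my own, not part of any library):

```python
import json

# dataset_info metadata as it appears in the card above, trimmed to the fields used here
card_metadata = json.loads("""
{"dataset_info": {"splits": [
  {"name": "train", "num_bytes": 5869405.081244598, "num_examples": 4188},
  {"name": "validation", "num_bytes": 232610, "num_examples": 200}]}}
""")


def summarize_splits(metadata: dict) -> dict:
    """Return {split_name: (num_examples, approx_size_MB)} from a card's dataset_info."""
    return {
        split["name"]: (split["num_examples"], round(split["num_bytes"] / 1e6, 2))
        for split in metadata["dataset_info"]["splits"]
    }


print(summarize_splits(card_metadata))
# → {'train': (4188, 5.87), 'validation': (200, 0.23)}
```

The same pattern applies to every card in this dump, since they all share the `splits`/`num_bytes`/`num_examples` layout.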
447d7d0ebb2b43e37f56cf47db5d0993b27ff6f4
# Dataset Card for "find_sent_before_sent_train_400_eval_40_no_permute" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_sent_before_sent_train_400_eval_40_no_permute
[ "region:us" ]
2023-11-23T06:10:57+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5869405.081244598, "num_examples": 4188}, {"name": "validation", "num_bytes": 232610, "num_examples": 200}], "download_size": 1125862, "dataset_size": 6102015.081244598}}
2023-11-23T06:11:04+00:00
3de547e11298e9a5e10ef564ca3a991876102d26
# Dataset Card for "find_sent_after_sent_train_400_eval_40_first_permute" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_sent_after_sent_train_400_eval_40_first_permute
[ "region:us" ]
2023-11-23T06:11:44+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5866475.834053587, "num_examples": 4188}, {"name": "validation", "num_bytes": 232483, "num_examples": 200}], "download_size": 1246832, "dataset_size": 6098958.834053587}}
2023-11-23T06:11:50+00:00
a3b6b812a46726eef65a59d9e29948f1a3a321a9
# Dataset Card for "find_sent_after_sent_train_400_eval_40_last_permute" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_sent_after_sent_train_400_eval_40_last_permute
[ "region:us" ]
2023-11-23T06:11:51+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5866475.834053587, "num_examples": 4188}, {"name": "validation", "num_bytes": 232483, "num_examples": 200}], "download_size": 1244208, "dataset_size": 6098958.834053587}}
2023-11-23T06:11:57+00:00
c4fd954fa62ee5afefe20fba8ee6ae599f812fbf
# Dataset Card for "find_sent_after_sent_train_400_eval_40_no_permute" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_sent_after_sent_train_400_eval_40_no_permute
[ "region:us" ]
2023-11-23T06:11:58+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5866475.834053587, "num_examples": 4188}, {"name": "validation", "num_bytes": 232483, "num_examples": 200}], "download_size": 1126325, "dataset_size": 6098958.834053587}}
2023-11-23T06:12:05+00:00
5c9f6ee24993afb4dc20622c5eae5c89f0fd13c1
# Dataset Card for "tab-wnut-flat" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
madaanpulkit/tab-wnut-flat
[ "region:us" ]
2023-11-23T06:29:47+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "tokens", "sequence": "string"}, {"name": "token_spans", "sequence": {"sequence": "int64"}}, {"name": "tags", "sequence": {"class_label": {"names": {"0": "0", "1": "B-DIRECT-CODE", "2": "I-DIRECT-CODE", "3": "B-DIRECT-PERSON", "4": "I-DIRECT-PERSON", "5": "B-QUASI-DATETIME", "6": "I-QUASI-DATETIME", "7": "B-QUASI-PERSON", "8": "I-QUASI-PERSON", "9": "B-QUASI-LOC", "10": "I-QUASI-LOC", "11": "B-QUASI-QUANTITY", "12": "I-QUASI-QUANTITY", "13": "B-QUASI-CODE", "14": "I-QUASI-CODE", "15": "B-QUASI-ORG", "16": "I-QUASI-ORG", "17": "B-QUASI-DEM", "18": "I-QUASI-DEM", "19": "B-QUASI-MISC", "20": "I-QUASI-MISC", "21": "B-DIRECT-ORG", "22": "I-DIRECT-ORG", "23": "B-DIRECT-DATETIME", "24": "I-DIRECT-DATETIME", "25": "B-DIRECT-LOC", "26": "I-DIRECT-LOC", "27": "B-DIRECT-MISC", "28": "I-DIRECT-MISC", "29": "B-DIRECT-DEM", "30": "I-DIRECT-DEM"}}}}, {"name": "doc_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 67834874, "num_examples": 1112}, {"name": "dev", "num_bytes": 19919192, "num_examples": 541}, {"name": "test", "num_bytes": 20147904, "num_examples": 555}], "download_size": 18198795, "dataset_size": 107901970}}
2023-12-01T17:14:45+00:00
cfecc8b1976190ee7fb551378751ab69592435fa
# Dataset Card for "capstone_fromgpt_without_gold_v4" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Deojoandco/capstone_fromgpt_without_gold_v4
[ "region:us" ]
2023-11-23T06:52:27+00:00
{"dataset_info": {"features": [{"name": "dialog_id", "dtype": "int64"}, {"name": "dialogue", "dtype": "string"}, {"name": "summary", "dtype": "string"}, {"name": "gold_tags", "dtype": "string"}, {"name": "gpt_success", "dtype": "bool"}, {"name": "gpt_response", "dtype": "string"}, {"name": "gold_tags_tokens_count", "dtype": "int64"}, {"name": "GPT_TAGS_FOUND", "dtype": "bool"}, {"name": "gpt_output_tags", "dtype": "string"}, {"name": "gpt_output_tag_tokens_count", "dtype": "int64"}, {"name": "GPT_MI_FOUND", "dtype": "bool"}, {"name": "gpt_tags_token_count", "dtype": "int64"}, {"name": "gpt_tags", "dtype": "string"}, {"name": "tag_token_count_match", "dtype": "bool"}], "splits": [{"name": "test", "num_bytes": 20862, "num_examples": 12}], "download_size": 22567, "dataset_size": 20862}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}]}
2023-11-23T06:52:31+00:00
7517c57f2d49871fcd431ccc4b202c7c854e5be4
# Dataset Card for "c_x86_O0_exebench_json_cleaned" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zhangshuoming/c_x86_O0_exebench_json_cleaned
[ "region:us" ]
2023-11-23T07:03:44+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1501572452.4970698, "num_examples": 677306}], "download_size": 193236342, "dataset_size": 1501572452.4970698}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-01-17T12:07:06+00:00
cad30bc1f6f66c65fb0011fbe96eccab132012f4
# Dataset Card for "squad_qa_title_v5_full_recite_ans_sent" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_title_v5_full_recite_ans_sent
[ "region:us" ]
2023-11-23T07:04:08+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7956437, "num_examples": 5070}, {"name": "validation", "num_bytes": 413353, "num_examples": 300}], "download_size": 0, "dataset_size": 8369790}}
2023-11-23T07:19:46+00:00
da49b23b8c8d14b7ba1a5f026ee5fa981334c30d
# Dataset Card for "squad_qa_wrong_title_v5_full_recite_ans_sent" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_wrong_title_v5_full_recite_ans_sent
[ "region:us" ]
2023-11-23T07:04:28+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "correct_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 8097026, "num_examples": 5070}, {"name": "validation", "num_bytes": 422069, "num_examples": 300}], "download_size": 0, "dataset_size": 8519095}}
2023-11-23T07:20:18+00:00
92e0b3cd87c9635c4b1c03dc8beb83b51adce19d
# Dataset Card for "squad_qa_num_v5_full_recite_ans_sent" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_num_v5_full_recite_ans_sent
[ "region:us" ]
2023-11-23T07:04:46+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7745401, "num_examples": 5070}, {"name": "validation", "num_bytes": 403389, "num_examples": 300}], "download_size": 0, "dataset_size": 8148790}}
2023-11-23T07:20:50+00:00
da667b3785723783fdb282e33a42f21957560711
# Dataset Card for "squad_qa_wrong_num_v5_full_recite_ans_sent" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_wrong_num_v5_full_recite_ans_sent
[ "region:us" ]
2023-11-23T07:06:41+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "correct_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7801171, "num_examples": 5070}, {"name": "validation", "num_bytes": 406689, "num_examples": 300}], "download_size": 0, "dataset_size": 8207860}}
2023-11-23T07:21:24+00:00
88e742e48d98a48b2d090f4a89c739c4495a8404
# Dataset Card for "squad_qa_rare_v5_full_recite_ans_sent" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_rare_v5_full_recite_ans_sent
[ "region:us" ]
2023-11-23T07:07:14+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7798024, "num_examples": 5070}, {"name": "validation", "num_bytes": 405531, "num_examples": 300}], "download_size": 0, "dataset_size": 8203555}}
2023-11-23T07:22:02+00:00
23b583e232582d06658adafb60f42c6db708682f
# Dataset Card for "squad_qa_wrong_rare_v5_full_recite_ans_sent" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_wrong_rare_v5_full_recite_ans_sent
[ "region:us" ]
2023-11-23T07:07:49+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "correct_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7874354, "num_examples": 5070}, {"name": "validation", "num_bytes": 409972, "num_examples": 300}], "download_size": 0, "dataset_size": 8284326}}
2023-11-23T07:22:37+00:00
1c3f1fedfc7b5bc2cefe10edb0bb08e7fa72c1ae
# Dataset Card for "squad_qa_baseline_v5_full_recite_ans_sent" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_baseline_v5_full_recite_ans_sent
[ "region:us" ]
2023-11-23T07:08:22+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2996506, "num_examples": 2385}, {"name": "validation", "num_bytes": 395889, "num_examples": 300}], "download_size": 0, "dataset_size": 3392395}}
2023-11-23T07:23:13+00:00
984f62ab8963a09a8a66892e9193362318eeb8a4
# Dataset Card for "squad_qa_context_v5_full_recite_ans_sent" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_context_v5_full_recite_ans_sent
[ "region:us" ]
2023-11-23T07:08:55+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4850217, "num_examples": 2385}, {"name": "validation", "num_bytes": 631113, "num_examples": 300}], "download_size": 0, "dataset_size": 5481330}}
2023-11-23T07:23:47+00:00
d5be65438b4ba879d7143f96397681e85f7b7cf8
# Dataset Card for "squad_qa_no_id_v5_full_recite_ans_sent" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_no_id_v5_full_recite_ans_sent
[ "region:us" ]
2023-11-23T07:09:28+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7874289, "num_examples": 5070}, {"name": "validation", "num_bytes": 402971, "num_examples": 300}], "download_size": 0, "dataset_size": 8277260}}
2023-11-23T07:24:21+00:00
529645a46fa2d7e28d82d8f64fc20488af621e78
# Dataset Card for "squad_qa_title_v5_full_recite_full_passage" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_title_v5_full_recite_full_passage
[ "region:us" ]
2023-11-23T07:19:56+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 9329162, "num_examples": 5070}, {"name": "validation", "num_bytes": 590772, "num_examples": 300}], "download_size": 1795168, "dataset_size": 9919934}}
2023-11-23T07:20:04+00:00
96d291f0b4e37206bc84555cd9f7b3d331f93dc4
# Dataset Card for "squad_qa_wrong_title_v5_full_recite_full_passage" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_wrong_title_v5_full_recite_full_passage
[ "region:us" ]
2023-11-23T07:20:29+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "correct_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 9469751, "num_examples": 5070}, {"name": "validation", "num_bytes": 599488, "num_examples": 300}], "download_size": 1873934, "dataset_size": 10069239}}
2023-11-23T07:20:37+00:00
0dd71ab685f6ef17f19cb5048307eae4f2d51e79
# Dataset Card for "squad_qa_num_v5_full_recite_full_passage" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_num_v5_full_recite_full_passage
[ "region:us" ]
2023-11-23T07:21:01+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 9118126, "num_examples": 5070}, {"name": "validation", "num_bytes": 580808, "num_examples": 300}], "download_size": 1769784, "dataset_size": 9698934}}
2023-11-23T07:21:10+00:00
d88a7f07239d84eb7a72f717598f84c5482acac3
# Dataset Card for "squad_qa_wrong_num_v5_full_recite_full_passage" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_wrong_num_v5_full_recite_full_passage
[ "region:us" ]
2023-11-23T07:21:36+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "correct_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 9173896, "num_examples": 5070}, {"name": "validation", "num_bytes": 584108, "num_examples": 300}], "download_size": 1807899, "dataset_size": 9758004}}
2023-11-23T07:21:46+00:00
47d45eaac8ab396463e9ee1690b160fe8329babe
# Dataset Card for "squad_qa_rare_v5_full_recite_full_passage" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_rare_v5_full_recite_full_passage
[ "region:us" ]
2023-11-23T07:22:12+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 9170749, "num_examples": 5070}, {"name": "validation", "num_bytes": 582950, "num_examples": 300}], "download_size": 1784741, "dataset_size": 9753699}}
2023-11-23T07:22:23+00:00
5c79e6285a263753bf43c2b3117b7079b15dc065
# Dataset Card for "squad_qa_wrong_rare_v5_full_recite_full_passage" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_wrong_rare_v5_full_recite_full_passage
[ "region:us" ]
2023-11-23T07:22:49+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "correct_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 9247079, "num_examples": 5070}, {"name": "validation", "num_bytes": 587391, "num_examples": 300}], "download_size": 1847562, "dataset_size": 9834470}}
2023-11-23T07:22:58+00:00
c3814381ee6909843ec00ba724b3de2d5faf4dce
# Dataset Card for "squad_qa_baseline_v5_full_recite_full_passage" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_baseline_v5_full_recite_full_passage
[ "region:us" ]
2023-11-23T07:23:23+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4369231, "num_examples": 2385}, {"name": "validation", "num_bytes": 573308, "num_examples": 300}], "download_size": 1012407, "dataset_size": 4942539}}
2023-11-23T07:23:32+00:00
1a4cec051a2151c3d38eb4f369b53588f26e0cf2
# Dataset Card for "squad_qa_context_v5_full_recite_full_passage" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_context_v5_full_recite_full_passage
[ "region:us" ]
2023-11-23T07:23:58+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6222942, "num_examples": 2385}, {"name": "validation", "num_bytes": 808532, "num_examples": 300}], "download_size": 1374285, "dataset_size": 7031474}}
2023-11-23T07:24:06+00:00
bbed60fc3029ea7cf78cc2723f23f601d5ba4b19
# Dataset Card for "squad_qa_no_id_v5_full_recite_full_passage" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/squad_qa_no_id_v5_full_recite_full_passage
[ "region:us" ]
2023-11-23T07:24:32+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "answer", "dtype": "string"}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 9247014, "num_examples": 5070}, {"name": "validation", "num_bytes": 580390, "num_examples": 300}], "download_size": 1781909, "dataset_size": 9827404}}
2023-11-23T07:24:40+00:00
67e0a8eacd05f6784c37e50fafe04373cbfa6345
# Dataset Card for "capstone_fromgpt_without_gold_v5" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Deojoandco/capstone_fromgpt_without_gold_v5
[ "region:us" ]
2023-11-23T07:31:12+00:00
{"dataset_info": {"features": [{"name": "dialog_id", "dtype": "int64"}, {"name": "dialogue", "dtype": "string"}, {"name": "summary", "dtype": "string"}, {"name": "gold_tags", "dtype": "string"}, {"name": "gpt_success", "dtype": "bool"}, {"name": "gpt_response", "dtype": "string"}, {"name": "gold_tags_tokens_count", "dtype": "int64"}, {"name": "GPT_TAGS_FOUND", "dtype": "bool"}, {"name": "gpt_output_tags", "dtype": "string"}, {"name": "gpt_output_tag_tokens_count", "dtype": "int64"}, {"name": "GPT_MI_FOUND", "dtype": "bool"}, {"name": "gpt_tags_token_count", "dtype": "int64"}, {"name": "gpt_tags", "dtype": "string"}, {"name": "tag_token_count_match", "dtype": "bool"}], "splits": [{"name": "test", "num_bytes": 19859, "num_examples": 12}], "download_size": 21426, "dataset_size": 19859}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}]}
2023-11-23T07:31:15+00:00
7b6ef938dc694f240ca1efc40f6a8c4e271b8f94
# Dataset Card for "zalo-crawler-v17-explanation" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
phucnn/zalo-crawler-v17-explanation
[ "region:us" ]
2023-11-23T07:53:43+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "explanation", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 74258299, "num_examples": 103531}], "download_size": 27978556, "dataset_size": 74258299}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-11-23T07:58:46+00:00
ef245e0956e602aa353c678ca694cc8f1c308f4f
# Dataset Card for "capstone_fromgpt_without_gold_v6" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Deojoandco/capstone_fromgpt_without_gold_v6
[ "region:us" ]
2023-11-23T07:53:47+00:00
{"dataset_info": {"features": [{"name": "dialog_id", "dtype": "int64"}, {"name": "dialogue", "dtype": "string"}, {"name": "summary", "dtype": "string"}, {"name": "gold_tags", "dtype": "string"}, {"name": "gpt_success", "dtype": "bool"}, {"name": "gpt_response", "dtype": "string"}, {"name": "gold_tags_tokens_count", "dtype": "int64"}, {"name": "GPT_TAGS_FOUND", "dtype": "bool"}, {"name": "gpt_output_tags", "dtype": "string"}, {"name": "gpt_output_tag_tokens_count", "dtype": "int64"}, {"name": "GPT_MI_FOUND", "dtype": "bool"}, {"name": "gpt_tags_token_count", "dtype": "int64"}, {"name": "gpt_tags", "dtype": "string"}, {"name": "tag_token_count_match", "dtype": "bool"}], "splits": [{"name": "test", "num_bytes": 20174, "num_examples": 12}], "download_size": 21461, "dataset_size": 20174}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}]}
2023-11-23T07:53:49+00:00
2cf7cbd8b17b2a538667f143b338fe0bf7bf43c8
# Dataset Card for "find_sent_after_sent_train_400_eval_40_random_permute_2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_sent_after_sent_train_400_eval_40_random_permute_2
[ "region:us" ]
2023-11-23T08:21:49+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3715966.658418829, "num_examples": 2874}, {"name": "validation", "num_bytes": 232483, "num_examples": 200}], "download_size": 1053957, "dataset_size": 3948449.658418829}}
2023-11-23T08:46:01+00:00
8c58260d2eba273e44917772bc675a1d945401b9
# Dataset Card for Evaluation run of microsoft/Orca-2-7b ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/microsoft/Orca-2-7b - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [microsoft/Orca-2-7b](https://huggingface.co/microsoft/Orca-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_microsoft__Orca-2-7b_public", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-11-23T08:52:22.157398](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__Orca-2-7b_public/blob/main/results_2023-11-23T08-52-22.157398.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5591515182783672, "acc_stderr": 0.03362651811696442, "acc_norm": 0.5666849678033645, "acc_norm_stderr": 0.03437864006901342, "mc1": 0.3684210526315789, "mc1_stderr": 0.016886551261046046, "mc2": 0.5244663206388774, "mc2_stderr": 0.016012530609803507, "em": 0.3205746644295302, "em_stderr": 0.004779419137797957, "f1": 0.43866505872483647, "f1_stderr": 0.004557698070527672 }, "harness|arc:challenge|25": { "acc": 0.5119453924914675, "acc_stderr": 0.014607220340597171, "acc_norm": 0.5409556313993175, "acc_norm_stderr": 0.01456229107360123 }, "harness|hellaswag|10": { "acc": 0.5828520215096594, "acc_stderr": 0.004920800313232742, "acc_norm": 0.7619000199163514, "acc_norm_stderr": 0.004250501643743773 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206824, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206824 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5925925925925926, "acc_stderr": 0.04244633238353228, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.04244633238353228 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.59, "acc_stderr": 0.04943110704237102, "acc_norm": 0.59, "acc_norm_stderr": 0.04943110704237102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6150943396226415, "acc_stderr": 0.02994649856769995, "acc_norm": 0.6150943396226415, "acc_norm_stderr": 0.02994649856769995 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5972222222222222, "acc_stderr": 0.04101405519842426, "acc_norm": 0.5972222222222222, "acc_norm_stderr": 0.04101405519842426 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.44, "acc_stderr": 0.049888765156985884, "acc_norm": 0.44, "acc_norm_stderr": 0.049888765156985884 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.049999999999999996, "acc_norm": 0.45, "acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5260115606936416, "acc_stderr": 0.03807301726504513, "acc_norm": 0.5260115606936416, "acc_norm_stderr": 0.03807301726504513 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.28431372549019607, "acc_stderr": 0.04488482852329017, "acc_norm": 0.28431372549019607, "acc_norm_stderr": 0.04488482852329017 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4765957446808511, "acc_stderr": 0.032650194750335815, "acc_norm": 0.4765957446808511, "acc_norm_stderr": 0.032650194750335815 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2894736842105263, "acc_stderr": 0.04266339443159394, "acc_norm": 0.2894736842105263, "acc_norm_stderr": 0.04266339443159394 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.496551724137931, "acc_stderr": 0.04166567577101579, "acc_norm": 0.496551724137931, "acc_norm_stderr": 0.04166567577101579 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.35714285714285715, "acc_stderr": 0.024677862841332783, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.024677862841332783 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.38095238095238093, "acc_stderr": 0.04343525428949097, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.04343525428949097 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6161290322580645, 
"acc_stderr": 0.02766618207553964, "acc_norm": 0.6161290322580645, "acc_norm_stderr": 0.02766618207553964 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4039408866995074, "acc_stderr": 0.03452453903822039, "acc_norm": 0.4039408866995074, "acc_norm_stderr": 0.03452453903822039 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7393939393939394, "acc_stderr": 0.034277431758165236, "acc_norm": 0.7393939393939394, "acc_norm_stderr": 0.034277431758165236 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7272727272727273, "acc_stderr": 0.03173071239071724, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.03173071239071724 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8290155440414507, "acc_stderr": 0.02717121368316453, "acc_norm": 0.8290155440414507, "acc_norm_stderr": 0.02717121368316453 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5282051282051282, "acc_stderr": 0.025310639254933882, "acc_norm": 0.5282051282051282, "acc_norm_stderr": 0.025310639254933882 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028597, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028597 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5210084033613446, "acc_stderr": 0.03244980849990029, "acc_norm": 0.5210084033613446, "acc_norm_stderr": 0.03244980849990029 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7504587155963303, "acc_stderr": 0.018553897629501628, "acc_norm": 0.7504587155963303, "acc_norm_stderr": 0.018553897629501628 }, 
"harness|hendrycksTest-high_school_statistics|5": { "acc": 0.375, "acc_stderr": 0.033016908987210894, "acc_norm": 0.375, "acc_norm_stderr": 0.033016908987210894 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7450980392156863, "acc_stderr": 0.030587591351604246, "acc_norm": 0.7450980392156863, "acc_norm_stderr": 0.030587591351604246 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.02747974455080851, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.02747974455080851 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6457399103139013, "acc_stderr": 0.032100621541349864, "acc_norm": 0.6457399103139013, "acc_norm_stderr": 0.032100621541349864 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.732824427480916, "acc_stderr": 0.03880848301082396, "acc_norm": 0.732824427480916, "acc_norm_stderr": 0.03880848301082396 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6859504132231405, "acc_stderr": 0.042369647530410184, "acc_norm": 0.6859504132231405, "acc_norm_stderr": 0.042369647530410184 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04557239513497751, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04557239513497751 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6441717791411042, "acc_stderr": 0.03761521380046734, "acc_norm": 0.6441717791411042, "acc_norm_stderr": 0.03761521380046734 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.375, "acc_stderr": 0.04595091388086298, "acc_norm": 0.375, "acc_norm_stderr": 0.04595091388086298 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8376068376068376, "acc_stderr": 0.02416161812798774, "acc_norm": 0.8376068376068376, "acc_norm_stderr": 0.02416161812798774 }, 
"harness|hendrycksTest-medical_genetics|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.756066411238825, "acc_stderr": 0.015357212665829468, "acc_norm": 0.756066411238825, "acc_norm_stderr": 0.015357212665829468 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6416184971098265, "acc_stderr": 0.025816756791584183, "acc_norm": 0.6416184971098265, "acc_norm_stderr": 0.025816756791584183 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.34413407821229053, "acc_stderr": 0.015889221313307094, "acc_norm": 0.34413407821229053, "acc_norm_stderr": 0.015889221313307094 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6241830065359477, "acc_stderr": 0.02773283435336394, "acc_norm": 0.6241830065359477, "acc_norm_stderr": 0.02773283435336394 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.617363344051447, "acc_stderr": 0.027604689028581986, "acc_norm": 0.617363344051447, "acc_norm_stderr": 0.027604689028581986 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.654320987654321, "acc_stderr": 0.026462487777001872, "acc_norm": 0.654320987654321, "acc_norm_stderr": 0.026462487777001872 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.39361702127659576, "acc_stderr": 0.029144544781596154, "acc_norm": 0.39361702127659576, "acc_norm_stderr": 0.029144544781596154 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.408735332464146, "acc_stderr": 0.012555701346703385, "acc_norm": 0.408735332464146, "acc_norm_stderr": 0.012555701346703385 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5183823529411765, "acc_stderr": 0.030352303395351964, "acc_norm": 0.5183823529411765, "acc_norm_stderr": 0.030352303395351964 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5441176470588235, "acc_stderr": 0.020148939420415745, "acc_norm": 0.5441176470588235, "acc_norm_stderr": 0.020148939420415745 }, 
"harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.636734693877551, "acc_stderr": 0.030789051139030806, "acc_norm": 0.636734693877551, "acc_norm_stderr": 0.030789051139030806 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6716417910447762, "acc_stderr": 0.033206858897443244, "acc_norm": 0.6716417910447762, "acc_norm_stderr": 0.033206858897443244 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.78, "acc_stderr": 0.041633319989322626, "acc_norm": 0.78, "acc_norm_stderr": 0.041633319989322626 }, "harness|hendrycksTest-virology|5": { "acc": 0.4819277108433735, "acc_stderr": 0.038899512528272166, "acc_norm": 0.4819277108433735, "acc_norm_stderr": 0.038899512528272166 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7602339181286549, "acc_stderr": 0.03274485211946956, "acc_norm": 0.7602339181286549, "acc_norm_stderr": 0.03274485211946956 }, "harness|truthfulqa:mc|0": { "mc1": 0.3684210526315789, "mc1_stderr": 0.016886551261046046, "mc2": 0.5244663206388774, "mc2_stderr": 0.016012530609803507 }, "harness|winogrande|5": { "acc": 0.7348066298342542, "acc_stderr": 0.01240654946619286 }, "harness|drop|3": { "em": 0.3205746644295302, "em_stderr": 0.004779419137797957, "f1": 0.43866505872483647, "f1_stderr": 0.004557698070527672 }, "harness|gsm8k|5": { "acc": 0.1470811220621683, "acc_stderr": 0.009756063660359875 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? 
[More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
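Each per-task block in the results payload above follows the same `{acc, acc_stderr, acc_norm, acc_norm_stderr}` shape. A minimal sketch of pulling per-task `acc_norm` numbers out of such a payload; the `results` dict below is a small hand-copied excerpt for illustration, not a live download:

```python
# Sketch: summarizing a results payload shaped like the JSON above.
# The `results` dict is a hand-copied excerpt, not fetched from the Hub.
results = {
    "all": {"acc": 0.5591515182783672, "acc_norm": 0.5666849678033645},
    "harness|arc:challenge|25": {"acc": 0.5119453924914675, "acc_norm": 0.5409556313993175},
    "harness|hellaswag|10": {"acc": 0.5828520215096594, "acc_norm": 0.7619000199163514},
}

def per_task_acc_norm(payload: dict) -> dict:
    """Map each task (skipping the 'all' aggregate) to its acc_norm, rounded.

    Tasks without an acc_norm field (e.g. winogrande, drop, gsm8k) are skipped.
    """
    return {task: round(vals["acc_norm"], 4)
            for task, vals in payload.items()
            if task != "all" and "acc_norm" in vals}

print(per_task_acc_norm(results))
```

The guard on `"acc_norm" in vals` matters because some harness tasks in these payloads report only `acc`, `em`/`f1`, or `mc1`/`mc2`.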
open-llm-leaderboard/details_microsoft__Orca-2-7b
[ "region:us" ]
2023-11-23T08:28:19+00:00
0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501628,\n \"acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501628\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082396,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082396\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6859504132231405,\n \"acc_stderr\": 0.042369647530410184,\n \"acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.042369647530410184\n },\n 
\"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.756066411238825,\n \"acc_stderr\": 0.015357212665829468,\n \"acc_norm\": 0.756066411238825,\n \"acc_norm_stderr\": 0.015357212665829468\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584183,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584183\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34413407821229053,\n \"acc_stderr\": 0.015889221313307094,\n \"acc_norm\": 0.34413407821229053,\n \"acc_norm_stderr\": 0.015889221313307094\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.02773283435336394,\n \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.02773283435336394\n },\n 
\"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n \"acc_stderr\": 0.027604689028581986,\n \"acc_norm\": 0.617363344051447,\n \"acc_norm_stderr\": 0.027604689028581986\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.026462487777001872,\n \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.026462487777001872\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596154,\n \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596154\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.408735332464146,\n \"acc_stderr\": 0.012555701346703385,\n \"acc_norm\": 0.408735332464146,\n \"acc_norm_stderr\": 0.012555701346703385\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.020148939420415745,\n \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.020148939420415745\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 
0.041633319989322626\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3684210526315789,\n \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5244663206388774,\n \"mc2_stderr\": 0.016012530609803507\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.01240654946619286\n },\n \"harness|drop|3\": {\n \"em\": 0.3205746644295302,\n \"em_stderr\": 0.004779419137797957,\n \"f1\": 0.43866505872483647,\n \"f1_stderr\": 0.004557698070527672\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1470811220621683,\n \"acc_stderr\": 0.009756063660359875\n }\n}\n```", "repo_url": "https://huggingface.co/microsoft/Orca-2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|arc:challenge|25_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|arc:challenge|25_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|drop|3_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|drop|3_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": 
"harness_gsm8k_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|gsm8k|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|gsm8k|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hellaswag|10_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hellaswag|10_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-25-14.186190.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-25-14.186190.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-25-14.186190.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-11-23T08-25-14.186190.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-52-22.157398.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-52-22.157398.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-52-22.157398.parquet", 
"**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-52-22.157398.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-52-22.157398.parquet", 
"**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-23T08-52-22.157398.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": 
["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": 
["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": 
["**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": 
["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["**/details_harness|winogrande|5_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["**/details_harness|winogrande|5_2023-11-23T08-52-22.157398.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-23T08-52-22.157398.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_23T08_25_14.186190", "path": ["results_2023-11-23T08-25-14.186190.parquet"]}, {"split": "2023_11_23T08_52_22.157398", "path": ["results_2023-11-23T08-52-22.157398.parquet"]}, {"split": "latest", "path": ["results_2023-11-23T08-52-22.157398.parquet"]}]}]}
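The metadata above registers, for each `config_name`, one dated split per evaluation run plus a `latest` split pointing at parquet globs. A minimal sketch of resolving the `latest` parquet paths for one configuration, using a small excerpt of that structure (the `latest_paths` helper is illustrative, not part of any library):

```python
import json

# A minimal excerpt of the card metadata shown above: each config maps
# named splits to lists of parquet path globs.
metadata = json.loads("""
[{"config_name": "harness_winogrande_5",
  "data_files": [
    {"split": "2023_11_23T08_25_14.186190",
     "path": ["**/details_harness|winogrande|5_2023-11-23T08-25-14.186190.parquet"]},
    {"split": "2023_11_23T08_52_22.157398",
     "path": ["**/details_harness|winogrande|5_2023-11-23T08-52-22.157398.parquet"]},
    {"split": "latest",
     "path": ["**/details_harness|winogrande|5_2023-11-23T08-52-22.157398.parquet"]}]}]
""")

def latest_paths(configs, config_name):
    """Return the parquet globs registered under the 'latest' split of a config."""
    for cfg in configs:
        if cfg["config_name"] == config_name:
            for entry in cfg["data_files"]:
                if entry["split"] == "latest":
                    return entry["path"]
    raise KeyError(config_name)

print(latest_paths(metadata, "harness_winogrande_5"))
```

Note that the `latest` split simply repeats the paths of the most recent dated split, so resolving it never requires date parsing.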
2023-11-23T08:56:11+00:00
TAGS #region-us
# Dataset Card for Evaluation run of microsoft/Orca-2-7b ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model microsoft/Orca-2-7b on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-11-23T08:52:22.157398 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
c175ec64e4d2fd604a32060d02867c1835883314
# Dataset Card for Evaluation run of migtissera/Tess-XS-v1.1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/migtissera/Tess-XS-v1.1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [migtissera/Tess-XS-v1.1](https://huggingface.co/migtissera/Tess-XS-v1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_migtissera__Tess-XS-v1.1_public", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-11-23T08:39:10.846213](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-XS-v1.1_public/blob/main/results_2023-11-23T08-39-10.846213.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6253362884117736, "acc_stderr": 0.03254975101958803, "acc_norm": 0.6343561981840767, "acc_norm_stderr": 0.0332634036672251, "mc1": 0.3463892288861689, "mc1_stderr": 0.01665699710912514, "mc2": 0.49923681207340576, "mc2_stderr": 0.01551504317540587, "em": 0.18278104026845637, "em_stderr": 0.003957987703151033, "f1": 0.27069211409396043, "f1_stderr": 0.004030013722161818 }, "harness|arc:challenge|25": { "acc": 0.5930034129692833, "acc_stderr": 0.014356399418009126, "acc_norm": 0.6390784982935154, "acc_norm_stderr": 0.014034761386175452 }, "harness|hellaswag|10": { "acc": 0.6512646883091018, "acc_stderr": 0.004755960559929163, "acc_norm": 0.8405696076478789, "acc_norm_stderr": 0.003653288043555801 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.049020713000019756, "acc_norm": 0.61, "acc_norm_stderr": 0.049020713000019756 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6867924528301886, "acc_stderr": 0.028544793319055326, "acc_norm": 0.6867924528301886, "acc_norm_stderr": 0.028544793319055326 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6944444444444444, "acc_stderr": 0.03852084696008534, "acc_norm": 0.6944444444444444, "acc_norm_stderr": 0.03852084696008534 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.04878317312145633, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145633 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6242774566473989, "acc_stderr": 0.036928207672648664, "acc_norm": 0.6242774566473989, "acc_norm_stderr": 0.036928207672648664 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3627450980392157, "acc_stderr": 0.047840607041056527, "acc_norm": 0.3627450980392157, "acc_norm_stderr": 0.047840607041056527 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.8, "acc_stderr": 0.04020151261036846, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5404255319148936, "acc_stderr": 0.03257901482099835, "acc_norm": 0.5404255319148936, "acc_norm_stderr": 0.03257901482099835 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.45614035087719296, "acc_stderr": 0.046854730419077895, "acc_norm": 0.45614035087719296, "acc_norm_stderr": 0.046854730419077895 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.02535574126305527, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.02535574126305527 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4523809523809524, "acc_stderr": 0.04451807959055328, "acc_norm": 0.4523809523809524, "acc_norm_stderr": 0.04451807959055328 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7516129032258064, 
"acc_stderr": 0.024580028921481003, "acc_norm": 0.7516129032258064, "acc_norm_stderr": 0.024580028921481003 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7515151515151515, "acc_stderr": 0.033744026441394036, "acc_norm": 0.7515151515151515, "acc_norm_stderr": 0.033744026441394036 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586808, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586808 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8756476683937824, "acc_stderr": 0.023814477086593542, "acc_norm": 0.8756476683937824, "acc_norm_stderr": 0.023814477086593542 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6641025641025641, "acc_stderr": 0.023946724741563976, "acc_norm": 0.6641025641025641, "acc_norm_stderr": 0.023946724741563976 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34444444444444444, "acc_stderr": 0.02897264888484427, "acc_norm": 0.34444444444444444, "acc_norm_stderr": 0.02897264888484427 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6932773109243697, "acc_stderr": 0.02995382389188704, "acc_norm": 0.6932773109243697, "acc_norm_stderr": 0.02995382389188704 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.03879687024073327, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.03879687024073327 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8220183486238533, "acc_stderr": 0.016399436366612917, "acc_norm": 0.8220183486238533, "acc_norm_stderr": 
0.016399436366612917 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 0.5, "acc_norm_stderr": 0.034099716973523674 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8235294117647058, "acc_stderr": 0.026756401538078962, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.026756401538078962 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7721518987341772, "acc_stderr": 0.027303484599069425, "acc_norm": 0.7721518987341772, "acc_norm_stderr": 0.027303484599069425 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7480916030534351, "acc_stderr": 0.03807387116306086, "acc_norm": 0.7480916030534351, "acc_norm_stderr": 0.03807387116306086 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.45535714285714285, "acc_stderr": 0.047268355537191, "acc_norm": 0.45535714285714285, "acc_norm_stderr": 0.047268355537191 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507332, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507332 }, 
"harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8212005108556832, "acc_stderr": 0.013702643715368985, "acc_norm": 0.8212005108556832, "acc_norm_stderr": 0.013702643715368985 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7138728323699421, "acc_stderr": 0.02433214677913413, "acc_norm": 0.7138728323699421, "acc_norm_stderr": 0.02433214677913413 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.36312849162011174, "acc_stderr": 0.016083749986853697, "acc_norm": 0.36312849162011174, "acc_norm_stderr": 0.016083749986853697 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7320261437908496, "acc_stderr": 0.025360603796242557, "acc_norm": 0.7320261437908496, "acc_norm_stderr": 0.025360603796242557 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.025494259350694912, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.025494259350694912 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7283950617283951, "acc_stderr": 0.024748624490537368, "acc_norm": 0.7283950617283951, "acc_norm_stderr": 0.024748624490537368 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.44680851063829785, "acc_stderr": 0.029658235097666907, "acc_norm": 0.44680851063829785, "acc_norm_stderr": 0.029658235097666907 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4706649282920469, "acc_stderr": 0.012748238397365549, "acc_norm": 0.4706649282920469, "acc_norm_stderr": 0.012748238397365549 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6617647058823529, "acc_stderr": 0.028739328513983576, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.028739328513983576 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6633986928104575, "acc_stderr": 0.019117213911495155, "acc_norm": 0.6633986928104575, "acc_norm_stderr": 0.019117213911495155 }, 
"harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7061224489795919, "acc_stderr": 0.02916273841024977, "acc_norm": 0.7061224489795919, "acc_norm_stderr": 0.02916273841024977 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.02587064676616914, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.02587064676616914 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.82, "acc_stderr": 0.03861229196653694, "acc_norm": 0.82, "acc_norm_stderr": 0.03861229196653694 }, "harness|hendrycksTest-virology|5": { "acc": 0.5240963855421686, "acc_stderr": 0.03887971849597264, "acc_norm": 0.5240963855421686, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7894736842105263, "acc_stderr": 0.031267817146631786, "acc_norm": 0.7894736842105263, "acc_norm_stderr": 0.031267817146631786 }, "harness|truthfulqa:mc|0": { "mc1": 0.3463892288861689, "mc1_stderr": 0.01665699710912514, "mc2": 0.49923681207340576, "mc2_stderr": 0.01551504317540587 }, "harness|winogrande|5": { "acc": 0.7916337805840569, "acc_stderr": 0.011414554399987726 }, "harness|drop|3": { "em": 0.18278104026845637, "em_stderr": 0.003957987703151033, "f1": 0.27069211409396043, "f1_stderr": 0.004030013722161818 }, "harness|gsm8k|5": { "acc": 0.16224412433661864, "acc_stderr": 0.010155130880393524 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? 
[More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
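The timestamped split names listed in this dataset's configs are zero-padded, so they sort lexicographically in chronological order, and the "latest" split always mirrors the most recent run. A minimal sketch of how that alias can be resolved by hand (the two split names are taken from this run's config list; `pick_latest` is an illustrative helper, not part of the `datasets` API):

```python
# The two timestamped splits of this dataset, as listed in its configs.
# The "latest" split always mirrors the most recent of these.
splits = ["2023_11_23T08_35_10.663595", "2023_11_23T08_39_10.846213"]

def pick_latest(split_names):
    """Return the newest timestamped split name; the zero-padded
    names sort lexicographically in chronological order."""
    return max(split_names)

print(pick_latest(splits))  # -> 2023_11_23T08_39_10.846213
```

When loading with `load_dataset`, passing `split="latest"` for a config gives the same rows without computing this by hand.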
open-llm-leaderboard/details_migtissera__Tess-XS-v1.1
[ "region:us" ]
2023-11-23T08:38:10+00:00
{"pretty_name": "Evaluation run of migtissera/Tess-XS-v1.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [migtissera/Tess-XS-v1.1](https://huggingface.co/migtissera/Tess-XS-v1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Tess-XS-v1.1_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-23T08:39:10.846213](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-XS-v1.1_public/blob/main/results_2023-11-23T08-39-10.846213.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6253362884117736,\n \"acc_stderr\": 0.03254975101958803,\n \"acc_norm\": 0.6343561981840767,\n \"acc_norm_stderr\": 0.0332634036672251,\n \"mc1\": 0.3463892288861689,\n \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.49923681207340576,\n \"mc2_stderr\": 0.01551504317540587,\n \"em\": 0.18278104026845637,\n \"em_stderr\": 0.003957987703151033,\n \"f1\": 0.27069211409396043,\n \"f1_stderr\": 0.004030013722161818\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.014356399418009126,\n \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175452\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6512646883091018,\n \"acc_stderr\": 0.004755960559929163,\n \"acc_norm\": 0.8405696076478789,\n \"acc_norm_stderr\": 0.003653288043555801\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.049020713000019756,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.049020713000019756\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 
0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n },\n 
\"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593542,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593542\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 
0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612917,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612917\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069425,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069425\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n 
\"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n \"acc_stderr\": 0.016083749986853697,\n \"acc_norm\": 0.36312849162011174,\n \"acc_norm_stderr\": 0.016083749986853697\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n 
\"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537368,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537368\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983576,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983576\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n 
\"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3463892288861689,\n \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.49923681207340576,\n \"mc2_stderr\": 0.01551504317540587\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987726\n },\n \"harness|drop|3\": {\n \"em\": 0.18278104026845637,\n \"em_stderr\": 0.003957987703151033,\n \"f1\": 0.27069211409396043,\n \"f1_stderr\": 0.004030013722161818\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16224412433661864,\n \"acc_stderr\": 0.010155130880393524\n }\n}\n```", "repo_url": "https://huggingface.co/migtissera/Tess-XS-v1.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|arc:challenge|25_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|arc:challenge|25_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|drop|3_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|drop|3_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": 
"2023_11_23T08_35_10.663595", "path": ["**/details_harness|gsm8k|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|gsm8k|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hellaswag|10_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hellaswag|10_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-35-10.663595.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-35-10.663595.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-35-10.663595.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-23T08-35-10.663595.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-39-10.846213.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-39-10.846213.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-39-10.846213.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-39-10.846213.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-39-10.846213.parquet", 
"**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-23T08-39-10.846213.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": 
"2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": 
["**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-23T08-35-10.663595.parquet"]}, 
{"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": 
["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-39-10.846213.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["**/details_harness|winogrande|5_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["**/details_harness|winogrande|5_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-23T08-39-10.846213.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_23T08_35_10.663595", "path": ["results_2023-11-23T08-35-10.663595.parquet"]}, {"split": "2023_11_23T08_39_10.846213", "path": ["results_2023-11-23T08-39-10.846213.parquet"]}, {"split": "latest", "path": 
["results_2023-11-23T08-39-10.846213.parquet"]}]}]}
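The split and file names in the metadata above follow a simple, inferable convention: each run timestamp appears twice, once as a split name (date and time joined with underscores) and once as a stamp inside the parquet filenames (time joined with hyphens), with a `latest` split aliasing the newest run. A minimal sketch of that rewrite, inferred from the metadata rather than taken from any official utility:

```python
def run_stamps(timestamp):
    """Derive the split name and parquet file stamp from a run timestamp.

    Inferred from the metadata above (not an official helper):
      split name: "2023_11_23T08_39_10.846213"  (date and time use "_")
      file stamp: "2023-11-23T08-39-10.846213"  (time uses "-")
    """
    date_part, time_part = timestamp.split("T")
    split_name = date_part.replace("-", "_") + "T" + time_part.replace(":", "_")
    file_stamp = date_part + "T" + time_part.replace(":", "-")
    return split_name, file_stamp

# → ('2023_11_23T08_39_10.846213', '2023-11-23T08-39-10.846213')
print(run_stamps("2023-11-23T08:39:10.846213"))
```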
2023-11-23T08:42:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of migtissera/Tess-XS-v1.1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model migtissera/Tess-XS-v1.1 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-11-23T08:39:10.846213 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one under the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
b2cd3668f2afc67585125c50f9b109008692c0fe
# Dataset Card for Evaluation run of 922-CA/monika-ddlc-7b-v1 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/922-CA/monika-ddlc-7b-v1 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [922-CA/monika-ddlc-7b-v1](https://huggingface.co/922-CA/monika-ddlc-7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_922-CA__monika-ddlc-7b-v1_public", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-11-23T09:14:32.105444](https://huggingface.co/datasets/open-llm-leaderboard/details_922-CA__monika-ddlc-7b-v1_public/blob/main/results_2023-11-23T09-14-32.105444.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.45724106931691777, "acc_stderr": 0.03438888001157538, "acc_norm": 0.462960301316153, "acc_norm_stderr": 0.03519719392287989, "mc1": 0.28151774785801714, "mc1_stderr": 0.01574402724825605, "mc2": 0.43943952364740935, "mc2_stderr": 0.014972022232931708, "em": 0.014786073825503355, "em_stderr": 0.0012360366760473, "f1": 0.07986682046979873, "f1_stderr": 0.0018932315277158172 }, "harness|arc:challenge|25": { "acc": 0.5, "acc_stderr": 0.014611390804670088, "acc_norm": 0.5494880546075085, "acc_norm_stderr": 0.014539646098471627 }, "harness|hellaswag|10": { "acc": 0.5778729336785501, "acc_stderr": 0.004928891895874295, "acc_norm": 0.7677753435570603, "acc_norm_stderr": 0.004213885798268836 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.3925925925925926, "acc_stderr": 0.04218506215368879, "acc_norm": 0.3925925925925926, "acc_norm_stderr": 0.04218506215368879 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.45394736842105265, "acc_stderr": 0.04051646342874142, "acc_norm": 0.45394736842105265, "acc_norm_stderr": 0.04051646342874142 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.38, "acc_stderr": 0.04878317312145632, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5132075471698113, "acc_stderr": 0.030762134874500482, "acc_norm": 0.5132075471698113, "acc_norm_stderr": 0.030762134874500482 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4861111111111111, "acc_stderr": 0.04179596617581, "acc_norm": 0.4861111111111111, "acc_norm_stderr": 0.04179596617581 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, 
"harness|hendrycksTest-college_computer_science|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.3815028901734104, "acc_stderr": 0.03703851193099521, "acc_norm": 0.3815028901734104, "acc_norm_stderr": 0.03703851193099521 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.24509803921568626, "acc_stderr": 0.04280105837364396, "acc_norm": 0.24509803921568626, "acc_norm_stderr": 0.04280105837364396 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.41702127659574467, "acc_stderr": 0.03223276266711712, "acc_norm": 0.41702127659574467, "acc_norm_stderr": 0.03223276266711712 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.32456140350877194, "acc_stderr": 0.04404556157374767, "acc_norm": 0.32456140350877194, "acc_norm_stderr": 0.04404556157374767 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.43448275862068964, "acc_stderr": 0.04130740879555497, "acc_norm": 0.43448275862068964, "acc_norm_stderr": 0.04130740879555497 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.29365079365079366, "acc_stderr": 0.023456037383982026, "acc_norm": 0.29365079365079366, "acc_norm_stderr": 0.023456037383982026 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.1984126984126984, "acc_stderr": 0.03567016675276865, "acc_norm": 0.1984126984126984, "acc_norm_stderr": 0.03567016675276865 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5290322580645161, 
"acc_stderr": 0.028396016402761005, "acc_norm": 0.5290322580645161, "acc_norm_stderr": 0.028396016402761005 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3497536945812808, "acc_stderr": 0.03355400904969566, "acc_norm": 0.3497536945812808, "acc_norm_stderr": 0.03355400904969566 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5636363636363636, "acc_stderr": 0.03872592983524754, "acc_norm": 0.5636363636363636, "acc_norm_stderr": 0.03872592983524754 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.601010101010101, "acc_stderr": 0.034889016168527326, "acc_norm": 0.601010101010101, "acc_norm_stderr": 0.034889016168527326 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6217616580310881, "acc_stderr": 0.03499807276193338, "acc_norm": 0.6217616580310881, "acc_norm_stderr": 0.03499807276193338 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.39487179487179486, "acc_stderr": 0.02478431694215638, "acc_norm": 0.39487179487179486, "acc_norm_stderr": 0.02478431694215638 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24074074074074073, "acc_stderr": 0.026067159222275794, "acc_norm": 0.24074074074074073, "acc_norm_stderr": 0.026067159222275794 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.031968769891957786, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.031968769891957786 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31788079470198677, "acc_stderr": 0.03802039760107903, "acc_norm": 0.31788079470198677, "acc_norm_stderr": 0.03802039760107903 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6073394495412844, "acc_stderr": 0.020937505161201093, "acc_norm": 0.6073394495412844, "acc_norm_stderr": 0.020937505161201093 }, 
"harness|hendrycksTest-high_school_statistics|5": { "acc": 0.3101851851851852, "acc_stderr": 0.03154696285656629, "acc_norm": 0.3101851851851852, "acc_norm_stderr": 0.03154696285656629 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6372549019607843, "acc_stderr": 0.03374499356319354, "acc_norm": 0.6372549019607843, "acc_norm_stderr": 0.03374499356319354 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6329113924050633, "acc_stderr": 0.03137624072561618, "acc_norm": 0.6329113924050633, "acc_norm_stderr": 0.03137624072561618 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5201793721973094, "acc_stderr": 0.033530461674123, "acc_norm": 0.5201793721973094, "acc_norm_stderr": 0.033530461674123 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5114503816793893, "acc_stderr": 0.04384140024078016, "acc_norm": 0.5114503816793893, "acc_norm_stderr": 0.04384140024078016 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6115702479338843, "acc_stderr": 0.04449270350068383, "acc_norm": 0.6115702479338843, "acc_norm_stderr": 0.04449270350068383 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5648148148148148, "acc_stderr": 0.04792898170907061, "acc_norm": 0.5648148148148148, "acc_norm_stderr": 0.04792898170907061 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5276073619631901, "acc_stderr": 0.0392237829061099, "acc_norm": 0.5276073619631901, "acc_norm_stderr": 0.0392237829061099 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.375, "acc_stderr": 0.04595091388086298, "acc_norm": 0.375, "acc_norm_stderr": 0.04595091388086298 }, "harness|hendrycksTest-management|5": { "acc": 0.5922330097087378, "acc_stderr": 0.048657775704107696, "acc_norm": 0.5922330097087378, "acc_norm_stderr": 0.048657775704107696 }, "harness|hendrycksTest-marketing|5": { "acc": 0.688034188034188, "acc_stderr": 0.030351527323344927, "acc_norm": 0.688034188034188, "acc_norm_stderr": 0.030351527323344927 }, 
"harness|hendrycksTest-medical_genetics|5": { "acc": 0.49, "acc_stderr": 0.05024183937956911, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6538952745849298, "acc_stderr": 0.01701196526641207, "acc_norm": 0.6538952745849298, "acc_norm_stderr": 0.01701196526641207 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5202312138728323, "acc_stderr": 0.026897049996382875, "acc_norm": 0.5202312138728323, "acc_norm_stderr": 0.026897049996382875 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2245810055865922, "acc_stderr": 0.01395680366654464, "acc_norm": 0.2245810055865922, "acc_norm_stderr": 0.01395680366654464 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.49019607843137253, "acc_stderr": 0.02862441255016795, "acc_norm": 0.49019607843137253, "acc_norm_stderr": 0.02862441255016795 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5369774919614148, "acc_stderr": 0.028320325830105915, "acc_norm": 0.5369774919614148, "acc_norm_stderr": 0.028320325830105915 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5339506172839507, "acc_stderr": 0.027756535257347666, "acc_norm": 0.5339506172839507, "acc_norm_stderr": 0.027756535257347666 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3723404255319149, "acc_stderr": 0.028838921471251458, "acc_norm": 0.3723404255319149, "acc_norm_stderr": 0.028838921471251458 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.32659713168187743, "acc_stderr": 0.011977676704715999, "acc_norm": 0.32659713168187743, "acc_norm_stderr": 0.011977676704715999 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.38235294117647056, "acc_stderr": 0.02952009569768776, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.02952009569768776 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.434640522875817, "acc_stderr": 0.020054269200726463, "acc_norm": 0.434640522875817, "acc_norm_stderr": 0.020054269200726463 }, 
"harness|hendrycksTest-public_relations|5": { "acc": 0.5181818181818182, "acc_stderr": 0.04785964010794915, "acc_norm": 0.5181818181818182, "acc_norm_stderr": 0.04785964010794915 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.5142857142857142, "acc_stderr": 0.03199615232806287, "acc_norm": 0.5142857142857142, "acc_norm_stderr": 0.03199615232806287 }, "harness|hendrycksTest-sociology|5": { "acc": 0.5572139303482587, "acc_stderr": 0.03512310964123937, "acc_norm": 0.5572139303482587, "acc_norm_stderr": 0.03512310964123937 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.65, "acc_stderr": 0.047937248544110196, "acc_norm": 0.65, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-virology|5": { "acc": 0.41566265060240964, "acc_stderr": 0.03836722176598052, "acc_norm": 0.41566265060240964, "acc_norm_stderr": 0.03836722176598052 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6666666666666666, "acc_stderr": 0.03615507630310935, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.03615507630310935 }, "harness|truthfulqa:mc|0": { "mc1": 0.28151774785801714, "mc1_stderr": 0.01574402724825605, "mc2": 0.43943952364740935, "mc2_stderr": 0.014972022232931708 }, "harness|winogrande|5": { "acc": 0.728492501973165, "acc_stderr": 0.012499326254893129 }, "harness|drop|3": { "em": 0.014786073825503355, "em_stderr": 0.0012360366760473, "f1": 0.07986682046979873, "f1_stderr": 0.0018932315277158172 }, "harness|gsm8k|5": { "acc": 0.08794541319181198, "acc_stderr": 0.007801162197487717 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? 
[More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
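The aggregated accuracy in the "all" block above is a plain mean over the per-task accuracies. A minimal sketch of that aggregation, using just four of the hendrycksTest subset values shown (only an illustration of the arithmetic; the real average runs over all 57 subsets plus the other tasks, which is why this number differs from the reported 0.4572):

```python
# A few of the per-subset accuracies copied from the results above.
accs = {
    "abstract_algebra": 0.28,
    "anatomy": 0.3925925925925926,
    "astronomy": 0.45394736842105265,
    "world_religions": 0.6666666666666666,
}

# Unweighted mean, the same aggregation the "all" block applies.
mean_acc = sum(accs.values()) / len(accs)
print(round(mean_acc, 4))  # → 0.4483
```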
open-llm-leaderboard/details_922-CA__monika-ddlc-7b-v1
[ "region:us" ]
2023-11-23T08:44:19+00:00
{"pretty_name": "Evaluation run of 922-CA/monika-ddlc-7b-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [922-CA/monika-ddlc-7b-v1](https://huggingface.co/922-CA/monika-ddlc-7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_922-CA__monika-ddlc-7b-v1_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-23T09:14:32.105444](https://huggingface.co/datasets/open-llm-leaderboard/details_922-CA__monika-ddlc-7b-v1_public/blob/main/results_2023-11-23T09-14-32.105444.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.45724106931691777,\n \"acc_stderr\": 0.03438888001157538,\n \"acc_norm\": 0.462960301316153,\n \"acc_norm_stderr\": 0.03519719392287989,\n \"mc1\": 0.28151774785801714,\n \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.43943952364740935,\n \"mc2_stderr\": 0.014972022232931708,\n \"em\": 0.014786073825503355,\n \"em_stderr\": 0.0012360366760473,\n \"f1\": 0.07986682046979873,\n \"f1_stderr\": 0.0018932315277158172\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.014611390804670088,\n \"acc_norm\": 0.5494880546075085,\n \"acc_norm_stderr\": 0.014539646098471627\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5778729336785501,\n \"acc_stderr\": 0.004928891895874295,\n \"acc_norm\": 0.7677753435570603,\n \"acc_norm_stderr\": 0.004213885798268836\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3925925925925926,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.3925925925925926,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874142,\n \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874142\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5132075471698113,\n \"acc_stderr\": 0.030762134874500482,\n \"acc_norm\": 0.5132075471698113,\n \"acc_norm_stderr\": 0.030762134874500482\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 
0.4861111111111111,\n \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3815028901734104,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.3815028901734104,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364396,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364396\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.023456037383982026,\n \"acc_norm\": 0.29365079365079366,\n 
\"acc_norm_stderr\": 0.023456037383982026\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n \"acc_stderr\": 0.03567016675276865,\n \"acc_norm\": 0.1984126984126984,\n \"acc_norm_stderr\": 0.03567016675276865\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5290322580645161,\n \"acc_stderr\": 0.028396016402761005,\n \"acc_norm\": 0.5290322580645161,\n \"acc_norm_stderr\": 0.028396016402761005\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969566,\n \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969566\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.03872592983524754,\n \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.03872592983524754\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.601010101010101,\n \"acc_stderr\": 0.034889016168527326,\n \"acc_norm\": 0.601010101010101,\n \"acc_norm_stderr\": 0.034889016168527326\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6217616580310881,\n \"acc_stderr\": 0.03499807276193338,\n \"acc_norm\": 0.6217616580310881,\n \"acc_norm_stderr\": 0.03499807276193338\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.39487179487179486,\n \"acc_stderr\": 0.02478431694215638,\n \"acc_norm\": 0.39487179487179486,\n \"acc_norm_stderr\": 0.02478431694215638\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n 
\"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.031968769891957786,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.031968769891957786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6073394495412844,\n \"acc_stderr\": 0.020937505161201093,\n \"acc_norm\": 0.6073394495412844,\n \"acc_norm_stderr\": 0.020937505161201093\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3101851851851852,\n \"acc_stderr\": 0.03154696285656629,\n \"acc_norm\": 0.3101851851851852,\n \"acc_norm_stderr\": 0.03154696285656629\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.03374499356319354,\n \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.03374499356319354\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6329113924050633,\n \"acc_stderr\": 0.03137624072561618,\n \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.03137624072561618\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5201793721973094,\n \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.5201793721973094,\n \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.04384140024078016,\n \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.04384140024078016\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n },\n 
\"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5276073619631901,\n \"acc_stderr\": 0.0392237829061099,\n \"acc_norm\": 0.5276073619631901,\n \"acc_norm_stderr\": 0.0392237829061099\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5922330097087378,\n \"acc_stderr\": 0.048657775704107696,\n \"acc_norm\": 0.5922330097087378,\n \"acc_norm_stderr\": 0.048657775704107696\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n \"acc_stderr\": 0.030351527323344927,\n \"acc_norm\": 0.688034188034188,\n \"acc_norm_stderr\": 0.030351527323344927\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6538952745849298,\n \"acc_stderr\": 0.01701196526641207,\n \"acc_norm\": 0.6538952745849298,\n \"acc_norm_stderr\": 0.01701196526641207\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382875,\n \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382875\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2245810055865922,\n \"acc_stderr\": 0.01395680366654464,\n \"acc_norm\": 0.2245810055865922,\n \"acc_norm_stderr\": 0.01395680366654464\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.02862441255016795,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.02862441255016795\n },\n 
\"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5369774919614148,\n \"acc_stderr\": 0.028320325830105915,\n \"acc_norm\": 0.5369774919614148,\n \"acc_norm_stderr\": 0.028320325830105915\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5339506172839507,\n \"acc_stderr\": 0.027756535257347666,\n \"acc_norm\": 0.5339506172839507,\n \"acc_norm_stderr\": 0.027756535257347666\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32659713168187743,\n \"acc_stderr\": 0.011977676704715999,\n \"acc_norm\": 0.32659713168187743,\n \"acc_norm_stderr\": 0.011977676704715999\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.02952009569768776,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.02952009569768776\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.434640522875817,\n \"acc_stderr\": 0.020054269200726463,\n \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.020054269200726463\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n \"acc_stderr\": 0.04785964010794915,\n \"acc_norm\": 0.5181818181818182,\n \"acc_norm_stderr\": 0.04785964010794915\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5142857142857142,\n \"acc_stderr\": 0.03199615232806287,\n \"acc_norm\": 0.5142857142857142,\n \"acc_norm_stderr\": 0.03199615232806287\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5572139303482587,\n \"acc_stderr\": 0.03512310964123937,\n \"acc_norm\": 0.5572139303482587,\n \"acc_norm_stderr\": 0.03512310964123937\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 
0.047937248544110196\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.41566265060240964,\n \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03615507630310935,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03615507630310935\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.43943952364740935,\n \"mc2_stderr\": 0.014972022232931708\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.728492501973165,\n \"acc_stderr\": 0.012499326254893129\n },\n \"harness|drop|3\": {\n \"em\": 0.014786073825503355,\n \"em_stderr\": 0.0012360366760473,\n \"f1\": 0.07986682046979873,\n \"f1_stderr\": 0.0018932315277158172\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08794541319181198,\n \"acc_stderr\": 0.007801162197487717\n }\n}\n```", "repo_url": "https://huggingface.co/922-CA/monika-ddlc-7b-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|arc:challenge|25_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|arc:challenge|25_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|arc:challenge|25_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|arc:challenge|25_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": 
["**/details_harness|drop|3_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|drop|3_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|drop|3_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|drop|3_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|gsm8k|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|gsm8k|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|gsm8k|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|gsm8k|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hellaswag|10_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hellaswag|10_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hellaswag|10_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hellaswag|10_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-41-13.979369.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-41-13.979369.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-41-13.979369.parquet", 
"**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-23T08-41-13.979369.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-54-08.112439.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-54-08.112439.parquet", 
"**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-54-08.112439.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-23T08-54-08.112439.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-01-28.625057.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-01-28.625057.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-01-28.625057.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-23T09-01-28.625057.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-14-32.105444.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-14-32.105444.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-14-32.105444.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-14-32.105444.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-14-32.105444.parquet", 
"**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-23T09-14-32.105444.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": 
["**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": 
"2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": 
["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": 
"2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": 
"2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": 
["**/details_harness|hendrycksTest-security_studies|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-14-32.105444.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": 
["**/details_harness|truthfulqa:mc|0_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["**/details_harness|winogrande|5_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["**/details_harness|winogrande|5_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["**/details_harness|winogrande|5_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["**/details_harness|winogrande|5_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-23T09-14-32.105444.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_23T08_41_13.979369", "path": ["results_2023-11-23T08-41-13.979369.parquet"]}, {"split": "2023_11_23T08_54_08.112439", "path": ["results_2023-11-23T08-54-08.112439.parquet"]}, {"split": "2023_11_23T09_01_28.625057", "path": ["results_2023-11-23T09-01-28.625057.parquet"]}, {"split": "2023_11_23T09_14_32.105444", "path": ["results_2023-11-23T09-14-32.105444.parquet"]}, {"split": "latest", "path": ["results_2023-11-23T09-14-32.105444.parquet"]}]}]}
2023-11-23T09:17:44+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of 922-CA/monika-ddlc-7b-v1 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model 922-CA/monika-ddlc-7b-v1 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-11-23T09:14:32.105444 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
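The loading step the card mentions ("you can for instance do the following:") was stripped from this copy; a minimal sketch is below. The details-repo id is an assumption derived from the leaderboard's usual naming convention (model name with "/" replaced by "__") and should be checked against the actual repository:

```python
# Sketch only: the Open LLM Leaderboard publishes per-model "details" repos.
# The repo id below is derived from the model name ("/" -> "__"); this naming
# convention is an assumption, not something this card confirms.
model = "922-CA/monika-ddlc-7b-v1"
details_repo = "open-llm-leaderboard/details_" + model.replace("/", "__")

# Loading one of the 64 per-task configurations (requires network access):
#
#   from datasets import load_dataset
#   data = load_dataset(details_repo, "harness_winogrande_5", split="latest")
#
# The "latest" split of each configuration always points at the newest run.
```

Any of the configuration names listed in the metadata above (e.g. "harness_hendrycksTest_virology_5") can be substituted for the second argument.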
[ "# Dataset Card for Evaluation run of 922-CA/monika-ddlc-7b-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model 922-CA/monika-ddlc-7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-11-23T09:14:32.105444(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of 922-CA/monika-ddlc-7b-v1", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model 922-CA/monika-ddlc-7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-11-23T09:14:32.105444(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 172, 66, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of 922-CA/monika-ddlc-7b-v1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model 922-CA/monika-ddlc-7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-23T09:14:32.105444(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
35b73a317a064845cd2bdf0a82b1cc4623ebe74d
# Dataset Card for "find_sent_after_sent_train_400_eval_40_random_permute_1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_sent_after_sent_train_400_eval_40_random_permute_1
[ "region:us" ]
2023-11-23T08:45:28+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3057558.7870563674, "num_examples": 2434}, {"name": "validation", "num_bytes": 232483, "num_examples": 200}], "download_size": 1040869, "dataset_size": 3290041.7870563674}}
2023-11-23T08:45:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for "find_sent_after_sent_train_400_eval_40_random_permute_1" More Information needed
[ "# Dataset Card for \"find_sent_after_sent_train_400_eval_40_random_permute_1\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"find_sent_after_sent_train_400_eval_40_random_permute_1\"\n\nMore Information needed" ]
[ 6, 36 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"find_sent_after_sent_train_400_eval_40_random_permute_1\"\n\nMore Information needed" ]
9ff3387def03e4bdcbb2e6d24028f4ab56dcd421
# Dataset Card for "find_sent_after_sent_train_400_eval_40_random_permute_4" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_sent_after_sent_train_400_eval_40_random_permute_4
[ "region:us" ]
2023-11-23T08:46:19+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5042967.317596567, "num_examples": 3754}, {"name": "validation", "num_bytes": 232483, "num_examples": 200}], "download_size": 1200503, "dataset_size": 5275450.317596567}}
2023-11-23T08:46:27+00:00
[]
[]
TAGS #region-us
# Dataset Card for "find_sent_after_sent_train_400_eval_40_random_permute_4" More Information needed
[ "# Dataset Card for \"find_sent_after_sent_train_400_eval_40_random_permute_4\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"find_sent_after_sent_train_400_eval_40_random_permute_4\"\n\nMore Information needed" ]
[ 6, 37 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"find_sent_after_sent_train_400_eval_40_random_permute_4\"\n\nMore Information needed" ]
9ffc1c57422233ecb97b8ac9c4bb905c1d20ab09
# Dataset Card for "find_sent_after_sent_train_400_eval_40_random_permute_8" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_sent_after_sent_train_400_eval_40_random_permute_8
[ "region:us" ]
2023-11-23T08:46:42+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7716633.701377225, "num_examples": 5514}, {"name": "validation", "num_bytes": 232483, "num_examples": 200}], "download_size": 1305162, "dataset_size": 7949116.701377225}}
2023-11-23T08:46:51+00:00
[]
[]
TAGS #region-us
# Dataset Card for "find_sent_after_sent_train_400_eval_40_random_permute_8" More Information needed
[ "# Dataset Card for \"find_sent_after_sent_train_400_eval_40_random_permute_8\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"find_sent_after_sent_train_400_eval_40_random_permute_8\"\n\nMore Information needed" ]
[ 6, 37 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"find_sent_after_sent_train_400_eval_40_random_permute_8\"\n\nMore Information needed" ]
7d9da642cc7ab5a0e29deb37591b799685e4592f
# Dataset Card for "find_sent_before_sent_train_400_eval_40_random_permute_1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_sent_before_sent_train_400_eval_40_random_permute_1
[ "region:us" ]
2023-11-23T08:47:08+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3060300.2129436326, "num_examples": 2434}, {"name": "validation", "num_bytes": 232610, "num_examples": 200}], "download_size": 1042600, "dataset_size": 3292910.2129436326}}
2023-11-23T08:47:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for "find_sent_before_sent_train_400_eval_40_random_permute_1" More Information needed
[ "# Dataset Card for \"find_sent_before_sent_train_400_eval_40_random_permute_1\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"find_sent_before_sent_train_400_eval_40_random_permute_1\"\n\nMore Information needed" ]
[ 6, 36 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"find_sent_before_sent_train_400_eval_40_random_permute_1\"\n\nMore Information needed" ]
9b98e8efb2760dad24c53d82c1bfe6339ee32bce
# Dataset Card for "find_sent_before_sent_train_400_eval_40_random_permute_2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_sent_before_sent_train_400_eval_40_random_permute_2
[ "region:us" ]
2023-11-23T08:47:33+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3718773.8817139408, "num_examples": 2874}, {"name": "validation", "num_bytes": 232610, "num_examples": 200}], "download_size": 1123950, "dataset_size": 3951383.8817139408}}
2023-11-23T08:47:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for "find_sent_before_sent_train_400_eval_40_random_permute_2" More Information needed
[ "# Dataset Card for \"find_sent_before_sent_train_400_eval_40_random_permute_2\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"find_sent_before_sent_train_400_eval_40_random_permute_2\"\n\nMore Information needed" ]
[ 6, 37 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"find_sent_before_sent_train_400_eval_40_random_permute_2\"\n\nMore Information needed" ]
2b5df63415159a24f4730c468961af5e636f3624
# Dataset Card for "find_sent_before_sent_train_400_eval_40_random_permute_4" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_sent_before_sent_train_400_eval_40_random_permute_4
[ "region:us" ]
2023-11-23T08:47:58+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5045864.718168813, "num_examples": 3754}, {"name": "validation", "num_bytes": 232610, "num_examples": 200}], "download_size": 1204239, "dataset_size": 5278474.718168813}}
2023-11-23T08:48:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for "find_sent_before_sent_train_400_eval_40_random_permute_4" More Information needed
[ "# Dataset Card for \"find_sent_before_sent_train_400_eval_40_random_permute_4\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"find_sent_before_sent_train_400_eval_40_random_permute_4\"\n\nMore Information needed" ]
[ 6, 37 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"find_sent_before_sent_train_400_eval_40_random_permute_4\"\n\nMore Information needed" ]
5fca1135621aa0d00f3084bc09f35a493828b97e
# Dataset Card for "find_sent_before_sent_train_400_eval_40_random_permute_8" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_sent_before_sent_train_400_eval_40_random_permute_8
[ "region:us" ]
2023-11-23T08:48:21+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7719631.487403426, "num_examples": 5514}, {"name": "validation", "num_bytes": 232610, "num_examples": 200}], "download_size": 1303658, "dataset_size": 7952241.487403426}}
2023-11-23T08:48:29+00:00
[]
[]
TAGS #region-us
# Dataset Card for "find_sent_before_sent_train_400_eval_40_random_permute_8" More Information needed
[ "# Dataset Card for \"find_sent_before_sent_train_400_eval_40_random_permute_8\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"find_sent_before_sent_train_400_eval_40_random_permute_8\"\n\nMore Information needed" ]
[ 6, 37 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"find_sent_before_sent_train_400_eval_40_random_permute_8\"\n\nMore Information needed" ]
a69a22e1653c484a7ed4069c3c55d198ae744cfd
# Dataset Card for "find_marker_both_sent_train_400_eval_40_random_permute_1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_marker_both_sent_train_400_eval_40_random_permute_1
[ "region:us" ]
2023-11-23T08:48:47+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3022759.5316631873, "num_examples": 2434}, {"name": "validation", "num_bytes": 220570, "num_examples": 200}], "download_size": 895275, "dataset_size": 3243329.5316631873}}
2023-11-23T08:48:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for "find_marker_both_sent_train_400_eval_40_random_permute_1" More Information needed
[ "# Dataset Card for \"find_marker_both_sent_train_400_eval_40_random_permute_1\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"find_marker_both_sent_train_400_eval_40_random_permute_1\"\n\nMore Information needed" ]
[ 6, 37 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"find_marker_both_sent_train_400_eval_40_random_permute_1\"\n\nMore Information needed" ]
2651532d456c68e86a7014fec21a6b26a042f6bb
# Dataset Card for "find_marker_both_sent_train_400_eval_40_random_permute_2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_marker_both_sent_train_400_eval_40_random_permute_2
[ "region:us" ]
2023-11-23T08:49:12+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3710817.0470730234, "num_examples": 2874}, {"name": "validation", "num_bytes": 220570, "num_examples": 200}], "download_size": 981619, "dataset_size": 3931387.0470730234}}
2023-11-23T08:49:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for "find_marker_both_sent_train_400_eval_40_random_permute_2" More Information needed
[ "# Dataset Card for \"find_marker_both_sent_train_400_eval_40_random_permute_2\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"find_marker_both_sent_train_400_eval_40_random_permute_2\"\n\nMore Information needed" ]
[ 6, 38 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"find_marker_both_sent_train_400_eval_40_random_permute_2\"\n\nMore Information needed" ]
9a3147462c295b38c529b54851fae333fd9c962c
# Dataset Card for "find_marker_both_sent_train_400_eval_40_random_permute_4" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_marker_both_sent_train_400_eval_40_random_permute_4
[ "region:us" ]
2023-11-23T08:49:36+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5100580.566046733, "num_examples": 3754}, {"name": "validation", "num_bytes": 220570, "num_examples": 200}], "download_size": 1075090, "dataset_size": 5321150.566046733}}
2023-11-23T08:49:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for "find_marker_both_sent_train_400_eval_40_random_permute_4" More Information needed
[ "# Dataset Card for \"find_marker_both_sent_train_400_eval_40_random_permute_4\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"find_marker_both_sent_train_400_eval_40_random_permute_4\"\n\nMore Information needed" ]
[ 6, 38 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"find_marker_both_sent_train_400_eval_40_random_permute_4\"\n\nMore Information needed" ]
1ebcd41a8fdcab0ef5cb3abf7b7b66ec96139974
# Dataset Card for "find_marker_both_sent_train_400_eval_40_random_permute_8" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tyzhu/find_marker_both_sent_train_400_eval_40_random_permute_8
[ "region:us" ]
2023-11-23T08:50:02+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7906460.143432986, "num_examples": 5514}, {"name": "validation", "num_bytes": 220570, "num_examples": 200}], "download_size": 1195516, "dataset_size": 8127030.143432986}}
2023-11-23T08:50:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for "find_marker_both_sent_train_400_eval_40_random_permute_8" More Information needed
[ "# Dataset Card for \"find_marker_both_sent_train_400_eval_40_random_permute_8\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"find_marker_both_sent_train_400_eval_40_random_permute_8\"\n\nMore Information needed" ]
[ 6, 38 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"find_marker_both_sent_train_400_eval_40_random_permute_8\"\n\nMore Information needed" ]
6f3e016dd1db655061aea6faae689d045f2ed08a
# Dataset Card for Evaluation run of microsoft/Orca-2-13b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [microsoft/Orca-2-13b](https://huggingface.co/microsoft/Orca-2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_microsoft__Orca-2-13b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-30T00:44:18.166149](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__Orca-2-13b/blob/main/results_2023-12-30T00-44-18.166149.json)(note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.601679092820444, "acc_stderr": 0.03296876808787226, "acc_norm": 0.6064308784221981, "acc_norm_stderr": 0.03364034807631641, "mc1": 0.3990208078335373, "mc1_stderr": 0.017142825728496767, "mc2": 0.5642038222037025, "mc2_stderr": 0.01593463688746652 }, "harness|arc:challenge|25": { "acc": 0.5742320819112628, "acc_stderr": 0.014449464278868802, "acc_norm": 0.6092150170648464, "acc_norm_stderr": 0.014258563880513778 }, "harness|hellaswag|10": { "acc": 0.6126269667396933, "acc_stderr": 0.004861544478451861, "acc_norm": 0.798546106353316, "acc_norm_stderr": 0.004002665957282747 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.042039210401562783, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.042039210401562783 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.743421052631579, "acc_stderr": 0.03554180368025689, "acc_norm": 0.743421052631579, "acc_norm_stderr": 0.03554180368025689 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6226415094339622, "acc_stderr": 0.029832808114796005, "acc_norm": 0.6226415094339622, "acc_norm_stderr": 0.029832808114796005 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6736111111111112, "acc_stderr": 0.03921067198982266, "acc_norm": 0.6736111111111112, "acc_norm_stderr": 0.03921067198982266 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.049756985195624284, "acc_norm": 0.43, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 
0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5549132947976878, "acc_stderr": 0.03789401760283647, "acc_norm": 0.5549132947976878, "acc_norm_stderr": 0.03789401760283647 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.04690650298201943, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.04690650298201943 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.67, "acc_stderr": 0.04725815626252609, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252609 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5574468085106383, "acc_stderr": 0.032469569197899575, "acc_norm": 0.5574468085106383, "acc_norm_stderr": 0.032469569197899575 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2894736842105263, "acc_stderr": 0.04266339443159394, "acc_norm": 0.2894736842105263, "acc_norm_stderr": 0.04266339443159394 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.36772486772486773, "acc_stderr": 0.02483383982556242, "acc_norm": 0.36772486772486773, "acc_norm_stderr": 0.02483383982556242 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.36507936507936506, "acc_stderr": 0.04306241259127153, "acc_norm": 0.36507936507936506, "acc_norm_stderr": 0.04306241259127153 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7387096774193549, "acc_stderr": 0.02499305339776481, "acc_norm": 0.7387096774193549, "acc_norm_stderr": 0.02499305339776481 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4729064039408867, "acc_stderr": 0.03512819077876106, "acc_norm": 0.4729064039408867, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7333333333333333, "acc_stderr": 0.03453131801885415, "acc_norm": 0.7333333333333333, "acc_norm_stderr": 0.03453131801885415 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7323232323232324, "acc_stderr": 0.03154449888270285, "acc_norm": 0.7323232323232324, "acc_norm_stderr": 0.03154449888270285 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8393782383419689, "acc_stderr": 0.026499057701397447, "acc_norm": 0.8393782383419689, "acc_norm_stderr": 0.026499057701397447 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5923076923076923, "acc_stderr": 0.024915243985987847, "acc_norm": 0.5923076923076923, "acc_norm_stderr": 0.024915243985987847 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.028317533496066475, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.028317533496066475 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6302521008403361, "acc_stderr": 0.03135709599613591, "acc_norm": 0.6302521008403361, "acc_norm_stderr": 0.03135709599613591 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 0.038615575462551684, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.038615575462551684 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8146788990825689, "acc_stderr": 0.01665927970029582, "acc_norm": 0.8146788990825689, "acc_norm_stderr": 0.01665927970029582 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.48148148148148145, "acc_stderr": 
0.03407632093854052, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.03407632093854052 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.803921568627451, "acc_stderr": 0.027865942286639325, "acc_norm": 0.803921568627451, "acc_norm_stderr": 0.027865942286639325 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8185654008438819, "acc_stderr": 0.025085961144579647, "acc_norm": 0.8185654008438819, "acc_norm_stderr": 0.025085961144579647 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7175572519083969, "acc_stderr": 0.03948406125768361, "acc_norm": 0.7175572519083969, "acc_norm_stderr": 0.03948406125768361 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.03749492448709697, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.03749492448709697 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7239263803680982, "acc_stderr": 0.035123852837050475, "acc_norm": 0.7239263803680982, "acc_norm_stderr": 0.035123852837050475 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.36607142857142855, "acc_stderr": 0.0457237235873743, "acc_norm": 0.36607142857142855, "acc_norm_stderr": 0.0457237235873743 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690876, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690876 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.022209309073165616, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.022209309073165616 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.62, "acc_stderr": 
0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7816091954022989, "acc_stderr": 0.014774358319934504, "acc_norm": 0.7816091954022989, "acc_norm_stderr": 0.014774358319934504 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6791907514450867, "acc_stderr": 0.025131000233647897, "acc_norm": 0.6791907514450867, "acc_norm_stderr": 0.025131000233647897 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3128491620111732, "acc_stderr": 0.01550689259464727, "acc_norm": 0.3128491620111732, "acc_norm_stderr": 0.01550689259464727 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6699346405228758, "acc_stderr": 0.026925654653615697, "acc_norm": 0.6699346405228758, "acc_norm_stderr": 0.026925654653615697 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6816720257234726, "acc_stderr": 0.026457225067811025, "acc_norm": 0.6816720257234726, "acc_norm_stderr": 0.026457225067811025 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7098765432098766, "acc_stderr": 0.025251173936495026, "acc_norm": 0.7098765432098766, "acc_norm_stderr": 0.025251173936495026 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.450354609929078, "acc_stderr": 0.029680105565029036, "acc_norm": 0.450354609929078, "acc_norm_stderr": 0.029680105565029036 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4367666232073012, "acc_stderr": 0.01266770191960367, "acc_norm": 0.4367666232073012, "acc_norm_stderr": 0.01266770191960367 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5698529411764706, "acc_stderr": 0.030074971917302875, "acc_norm": 0.5698529411764706, "acc_norm_stderr": 0.030074971917302875 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6062091503267973, "acc_stderr": 0.019766211991073066, "acc_norm": 0.6062091503267973, "acc_norm_stderr": 0.019766211991073066 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6454545454545455, "acc_stderr": 
0.045820048415054174, "acc_norm": 0.6454545454545455, "acc_norm_stderr": 0.045820048415054174 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.02866685779027465, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.02866685779027465 }, "harness|hendrycksTest-sociology|5": { "acc": 0.736318407960199, "acc_stderr": 0.03115715086935557, "acc_norm": 0.736318407960199, "acc_norm_stderr": 0.03115715086935557 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036624, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036624 }, "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.03891364495835817, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.03891364495835817 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8011695906432749, "acc_stderr": 0.030611116557432528, "acc_norm": 0.8011695906432749, "acc_norm_stderr": 0.030611116557432528 }, "harness|truthfulqa:mc|0": { "mc1": 0.3990208078335373, "mc1_stderr": 0.017142825728496767, "mc2": 0.5642038222037025, "mc2_stderr": 0.01593463688746652 }, "harness|winogrande|5": { "acc": 0.7655880031570639, "acc_stderr": 0.011906130106237988 }, "harness|gsm8k|5": { "acc": 0.378316906747536, "acc_stderr": 0.013358407831777126 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
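## Working with the results dict A minimal sketch of how the per-task scores in the "Latest results" section above can be post-processed. It hard-codes a small excerpt of the results dict (values copied verbatim from the run above); the `best_metric` helper is an illustrative assumption, not part of the leaderboard tooling, and simply prefers normalized accuracy when the harness reports it.

```python
# Excerpt of the "Latest results" dict above (values copied verbatim).
results = {
    "harness|arc:challenge|25": {"acc": 0.5742320819112628, "acc_norm": 0.6092150170648464},
    "harness|hellaswag|10": {"acc": 0.6126269667396933, "acc_norm": 0.798546106353316},
    "harness|winogrande|5": {"acc": 0.7655880031570639},
    "harness|gsm8k|5": {"acc": 0.378316906747536},
}

def best_metric(scores: dict) -> float:
    # Prefer normalized accuracy when the harness reports it,
    # falling back to raw accuracy otherwise.
    return scores.get("acc_norm", scores.get("acc"))

for task, scores in results.items():
    # Task keys have the form "harness|<task>|<n_shots>".
    name = task.split("|")[1]
    print(f"{name}: {best_metric(scores):.4f}")
```

The same pattern applies to the full dict loaded from the `results` configuration, since every `harness|...` entry follows the same key shape.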
open-llm-leaderboard/details_microsoft__Orca-2-13b
[ "region:us" ]
2023-11-23T09:04:05+00:00
{"pretty_name": "Evaluation run of microsoft/Orca-2-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [microsoft/Orca-2-13b](https://huggingface.co/microsoft/Orca-2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_microsoft__Orca-2-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-30T00:44:18.166149](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__Orca-2-13b/blob/main/results_2023-12-30T00-44-18.166149.json)(note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.601679092820444,\n \"acc_stderr\": 0.03296876808787226,\n \"acc_norm\": 0.6064308784221981,\n \"acc_norm_stderr\": 0.03364034807631641,\n \"mc1\": 0.3990208078335373,\n \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.5642038222037025,\n \"mc2_stderr\": 0.01593463688746652\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5742320819112628,\n \"acc_stderr\": 0.014449464278868802,\n \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.014258563880513778\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6126269667396933,\n \"acc_stderr\": 0.004861544478451861,\n \"acc_norm\": 0.798546106353316,\n \"acc_norm_stderr\": 0.004002665957282747\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.03554180368025689,\n \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.03554180368025689\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n 
\"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36772486772486773,\n \"acc_stderr\": 0.02483383982556242,\n \"acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.02483383982556242\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 
0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n \"acc_stderr\": 0.02499305339776481,\n \"acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.02499305339776481\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397447,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397447\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5923076923076923,\n \"acc_stderr\": 0.024915243985987847,\n \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.024915243985987847\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066475,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066475\n },\n 
\"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.01665927970029582,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.01665927970029582\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n 
\"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690876,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690876\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7816091954022989,\n \"acc_stderr\": 0.014774358319934504,\n \"acc_norm\": 0.7816091954022989,\n \"acc_norm_stderr\": 0.014774358319934504\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647897,\n \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647897\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3128491620111732,\n \"acc_stderr\": 0.01550689259464727,\n \"acc_norm\": 0.3128491620111732,\n \"acc_norm_stderr\": 0.01550689259464727\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615697,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615697\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811025,\n 
\"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4367666232073012,\n \"acc_stderr\": 0.01266770191960367,\n \"acc_norm\": 0.4367666232073012,\n \"acc_norm_stderr\": 0.01266770191960367\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6062091503267973,\n \"acc_stderr\": 0.019766211991073066,\n \"acc_norm\": 0.6062091503267973,\n \"acc_norm_stderr\": 0.019766211991073066\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n \"acc_stderr\": 0.03115715086935557,\n \"acc_norm\": 0.736318407960199,\n \"acc_norm_stderr\": 0.03115715086935557\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 
0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3990208078335373,\n \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.5642038222037025,\n \"mc2_stderr\": 0.01593463688746652\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237988\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.378316906747536,\n \"acc_stderr\": 0.013358407831777126\n }\n}\n```", "repo_url": "https://huggingface.co/microsoft/Orca-2-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|arc:challenge|25_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|arc:challenge|25_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|drop|3_2023-11-23T09-00-59.774377.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-23T09-00-59.774377.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|gsm8k|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|gsm8k|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hellaswag_10", 
"data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hellaswag|10_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hellaswag|10_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-00-59.774377.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-00-59.774377.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-23T09-00-59.774377.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-44-18.166149.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-44-18.166149.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-44-18.166149.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-44-18.166149.parquet", 
"**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-44-18.166149.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-44-18.166149.parquet", 
"**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-30T00-44-18.166149.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": 
["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": 
["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": 
["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-44-18.166149.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", 
"data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": 
"2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-44-18.166149.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["**/details_harness|winogrande|5_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["**/details_harness|winogrande|5_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-30T00-44-18.166149.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_23T09_00_59.774377", "path": ["results_2023-11-23T09-00-59.774377.parquet"]}, {"split": "2023_12_30T00_44_18.166149", "path": ["results_2023-12-30T00-44-18.166149.parquet"]}, {"split": "latest", "path": ["results_2023-12-30T00-44-18.166149.parquet"]}]}]}
2023-12-30T00:47:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of microsoft/Orca-2-13b Dataset automatically created during the evaluation run of model microsoft/Orca-2-13b on the Open LLM Leaderboard. The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-30T00:44:18.166149(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of microsoft/Orca-2-13b\n\n\n\nDataset automatically created during the evaluation run of model microsoft/Orca-2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-30T00:44:18.166149(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. 
More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of microsoft/Orca-2-13b\n\n\n\nDataset automatically created during the evaluation run of model microsoft/Orca-2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-30T00:44:18.166149(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. 
More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ 6, 177, 66, 4, 40, 29, 3, 4, 9, 6, 5, 7, 4, 7, 10, 9, 5, 9, 8, 10, 46, 8, 7, 10, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of microsoft/Orca-2-13b\n\n\n\nDataset automatically created during the evaluation run of model microsoft/Orca-2-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-30T00:44:18.166149(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact" ]
da123249c94ce0f6d448082ffba467cd0a518b3d
# Datasets for the Direct Preference for Denoising Diffusion Policy Optimization (D3PO) **Description**: The dataset for the image distortion experiment of the [`anything-v5`](https://huggingface.co/stablediffusionapi/anything-v5) model in the paper [Using Human Feedback to Fine-tune Diffusion Models without Any Reward Model](https://arxiv.org/abs/2311.13231). (2024.1.22 Update: Add the dataset for evaluating text and image alignment before and after fine-tuning.) **Source Code**: The code used to generate this data can be found [here](https://github.com/yk7333/D3PO/tree/main). **Directory** - d3po_dataset - epoch1 - all_img - *.png - deformed_img - *.png - json - data.json (required for training) - prompt.json - sample.pkl(required for training) - epoch2` - ... - epoch5 - text2img_dataset: - img - data_*.json - plot.ipynb - prompt.txt **Citation** ``` @article{yang2023using, title={Using Human Feedback to Fine-tune Diffusion Models without Any Reward Model}, author={Yang, Kai and Tao, Jian and Lyu, Jiafei and Ge, Chunjiang and Chen, Jiaxin and Li, Qimai and Shen, Weihan and Zhu, Xiaolong and Li, Xiu}, journal={arXiv preprint arXiv:2311.13231}, year={2023} } ```
yangkaiSIGS/d3po_datasets
[ "arxiv:2311.13231", "region:us" ]
2023-11-23T09:05:34+00:00
{}
2024-01-22T11:27:02+00:00
[ "2311.13231" ]
[]
TAGS #arxiv-2311.13231 #region-us
# Datasets for the Direct Preference for Denoising Diffusion Policy Optimization (D3PO) Description: The dataset for the image distortion experiment of the 'anything-v5' model in the paper Using Human Feedback to Fine-tune Diffusion Models without Any Reward Model. (2024.1.22 Update: Add the dataset for evaluating text and image alignment before and after fine-tuning.) Source Code: The code used to generate this data can be found here. Directory - d3po_dataset - epoch1 - all_img - *.png - deformed_img - *.png - json - URL (required for training) - URL - URL(required for training) - epoch2' - ... - epoch5 - text2img_dataset: - img - data_*.json - URL - URL Citation
[ "# Datasets for the Direct Preference for Denoising Diffusion Policy Optimization (D3PO)\n\nDescription: The dataset for the image distortion experiment of the 'anything-v5' model in the paper Using Human Feedback to Fine-tune Diffusion Models without Any Reward Model.\n (2024.1.22 Update: Add the dataset for evaluating text and image alignment before and after fine-tuning.)\n \nSource Code: The code used to generate this data can be found here.\n\nDirectory\n- d3po_dataset\n - epoch1\n - all_img\n - *.png\n - deformed_img\n - *.png\n - json\n - URL (required for training)\n - URL\n - URL(required for training)\n - epoch2'\n - ...\n - epoch5\n\n \n- text2img_dataset:\n - img\n - data_*.json\n - URL\n - URL\n\nCitation" ]
[ "TAGS\n#arxiv-2311.13231 #region-us \n", "# Datasets for the Direct Preference for Denoising Diffusion Policy Optimization (D3PO)\n\nDescription: The dataset for the image distortion experiment of the 'anything-v5' model in the paper Using Human Feedback to Fine-tune Diffusion Models without Any Reward Model.\n (2024.1.22 Update: Add the dataset for evaluating text and image alignment before and after fine-tuning.)\n \nSource Code: The code used to generate this data can be found here.\n\nDirectory\n- d3po_dataset\n - epoch1\n - all_img\n - *.png\n - deformed_img\n - *.png\n - json\n - URL (required for training)\n - URL\n - URL(required for training)\n - epoch2'\n - ...\n - epoch5\n\n \n- text2img_dataset:\n - img\n - data_*.json\n - URL\n - URL\n\nCitation" ]
[ 15, 206 ]
[ "passage: TAGS\n#arxiv-2311.13231 #region-us \n# Datasets for the Direct Preference for Denoising Diffusion Policy Optimization (D3PO)\n\nDescription: The dataset for the image distortion experiment of the 'anything-v5' model in the paper Using Human Feedback to Fine-tune Diffusion Models without Any Reward Model.\n (2024.1.22 Update: Add the dataset for evaluating text and image alignment before and after fine-tuning.)\n \nSource Code: The code used to generate this data can be found here.\n\nDirectory\n- d3po_dataset\n - epoch1\n - all_img\n - *.png\n - deformed_img\n - *.png\n - json\n - URL (required for training)\n - URL\n - URL(required for training)\n - epoch2'\n - ...\n - epoch5\n\n \n- text2img_dataset:\n - img\n - data_*.json\n - URL\n - URL\n\nCitation" ]
f27b942bd7ae0d46f47e19206fbaa271a95ed174
# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2 ## Dataset Description - **Homepage:** - **Repository:** https://huggingface.co/perlthoughts/Chupacabra-7B-v2 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** [email protected] ### Dataset Summary Dataset automatically created during the evaluation run of model [perlthoughts/Chupacabra-7B-v2](https://huggingface.co/perlthoughts/Chupacabra-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-04T18:02:58.053786](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2/blob/main/results_2023-12-04T18-02-58.053786.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6367599015709243, "acc_stderr": 0.03218025799515212, "acc_norm": 0.6396357428050704, "acc_norm_stderr": 0.0328187456646889, "mc1": 0.397796817625459, "mc1_stderr": 0.017133934248559635, "mc2": 0.5717077514762566, "mc2_stderr": 0.0156197692783717 }, "harness|arc:challenge|25": { "acc": 0.613481228668942, "acc_stderr": 0.014230084761910478, "acc_norm": 0.6518771331058021, "acc_norm_stderr": 0.013921008595179342 }, "harness|hellaswag|10": { "acc": 0.6473809998008365, "acc_stderr": 0.004768088918512182, "acc_norm": 0.8338976299541924, "acc_norm_stderr": 0.003714118884317389 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.03782728980865469, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.03782728980865469 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6754716981132075, "acc_stderr": 0.02881561571343211, "acc_norm": 0.6754716981132075, "acc_norm_stderr": 0.02881561571343211 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6944444444444444, "acc_stderr": 0.03852084696008534, "acc_norm": 0.6944444444444444, "acc_norm_stderr": 0.03852084696008534 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6127167630057804, "acc_stderr": 0.03714325906302065, "acc_norm": 0.6127167630057804, "acc_norm_stderr": 0.03714325906302065 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107223, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107223 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932261, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932261 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.548936170212766, "acc_stderr": 0.03252909619613197, "acc_norm": 0.548936170212766, "acc_norm_stderr": 0.03252909619613197 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5087719298245614, "acc_stderr": 0.047028804320496165, "acc_norm": 0.5087719298245614, "acc_norm_stderr": 0.047028804320496165 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5793103448275863, "acc_stderr": 0.0411391498118926, "acc_norm": 0.5793103448275863, "acc_norm_stderr": 0.0411391498118926 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3994708994708995, "acc_stderr": 0.02522545028406788, "acc_norm": 0.3994708994708995, "acc_norm_stderr": 0.02522545028406788 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5, "acc_stderr": 0.04472135954999579, "acc_norm": 0.5, "acc_norm_stderr": 0.04472135954999579 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7709677419354839, "acc_stderr": 0.023904914311782658, "acc_norm": 0.7709677419354839, "acc_norm_stderr": 0.023904914311782658 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 
0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7454545454545455, "acc_stderr": 0.03401506715249039, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.03401506715249039 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7575757575757576, "acc_stderr": 0.030532892233932022, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.030532892233932022 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033456, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033456 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6461538461538462, "acc_stderr": 0.024243783994062157, "acc_norm": 0.6461538461538462, "acc_norm_stderr": 0.024243783994062157 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.028226446749683515, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.028226446749683515 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6596638655462185, "acc_stderr": 0.03077805742293167, "acc_norm": 0.6596638655462185, "acc_norm_stderr": 0.03077805742293167 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2913907284768212, "acc_stderr": 0.037101857261199946, "acc_norm": 0.2913907284768212, "acc_norm_stderr": 0.037101857261199946 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8293577981651377, "acc_stderr": 0.016129271025099867, "acc_norm": 0.8293577981651377, "acc_norm_stderr": 0.016129271025099867 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 0.5, "acc_norm_stderr": 0.034099716973523674 }, 
"harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026156867523931045, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026156867523931045 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.025955020841621126, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.025955020841621126 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7404580152671756, "acc_stderr": 0.03844876139785271, "acc_norm": 0.7404580152671756, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5089285714285714, "acc_stderr": 0.04745033255489123, "acc_norm": 0.5089285714285714, "acc_norm_stderr": 0.04745033255489123 }, "harness|hendrycksTest-management|5": { "acc": 0.8252427184466019, "acc_stderr": 0.037601780060266196, "acc_norm": 0.8252427184466019, "acc_norm_stderr": 0.037601780060266196 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.02190190511507333, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.02190190511507333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.76, "acc_stderr": 0.04292346959909283, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909283 }, 
"harness|hendrycksTest-miscellaneous|5": { "acc": 0.8173690932311622, "acc_stderr": 0.013816335389973136, "acc_norm": 0.8173690932311622, "acc_norm_stderr": 0.013816335389973136 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6820809248554913, "acc_stderr": 0.025070713719153183, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.025070713719153183 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4033519553072626, "acc_stderr": 0.01640712303219525, "acc_norm": 0.4033519553072626, "acc_norm_stderr": 0.01640712303219525 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7058823529411765, "acc_stderr": 0.026090162504279056, "acc_norm": 0.7058823529411765, "acc_norm_stderr": 0.026090162504279056 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6913183279742765, "acc_stderr": 0.026236965881153273, "acc_norm": 0.6913183279742765, "acc_norm_stderr": 0.026236965881153273 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7037037037037037, "acc_stderr": 0.025407197798890162, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.025407197798890162 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4787234042553192, "acc_stderr": 0.029800481645628693, "acc_norm": 0.4787234042553192, "acc_norm_stderr": 0.029800481645628693 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4602346805736636, "acc_stderr": 0.01272978538659856, "acc_norm": 0.4602346805736636, "acc_norm_stderr": 0.01272978538659856 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6654411764705882, "acc_stderr": 0.028661996202335303, "acc_norm": 0.6654411764705882, "acc_norm_stderr": 0.028661996202335303 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6519607843137255, "acc_stderr": 0.01927099870822398, "acc_norm": 0.6519607843137255, "acc_norm_stderr": 0.01927099870822398 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 
0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7551020408163265, "acc_stderr": 0.027529637440174923, "acc_norm": 0.7551020408163265, "acc_norm_stderr": 0.027529637440174923 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.02587064676616913, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.02587064676616913 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8128654970760234, "acc_stderr": 0.02991312723236804, "acc_norm": 0.8128654970760234, "acc_norm_stderr": 0.02991312723236804 }, "harness|truthfulqa:mc|0": { "mc1": 0.397796817625459, "mc1_stderr": 0.017133934248559635, "mc2": 0.5717077514762566, "mc2_stderr": 0.0156197692783717 }, "harness|winogrande|5": { "acc": 0.7813733228097869, "acc_stderr": 0.01161619821577323 }, "harness|gsm8k|5": { "acc": 0.5473843821076573, "acc_stderr": 0.013710499070935132 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? 
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
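The per-task numbers in the results block above can be post-processed locally once loaded; a minimal sketch, using values hand-copied from the "Latest results" JSON shown above (with a live checkout you would read them from the results_*.json file instead):

```python
# Summarise a hand-copied subset of the "Latest results" reported above.
# In practice these values come from the results_*.json file in the repo.
results = {
    "harness|arc:challenge|25": 0.6518771331058021,  # acc_norm
    "harness|hellaswag|10": 0.8338976299541924,      # acc_norm
    "harness|winogrande|5": 0.7813733228097869,      # acc
    "harness|gsm8k|5": 0.5473843821076573,           # acc
}

average = sum(results.values()) / len(results)
print(f"mean over {len(results)} tasks: {average:.4f}")  # mean over 4 tasks: 0.7036
```

With the full results file, the same loop extends to every task in the run.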
open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2
[ "region:us" ]
2023-11-23T09:09:05+00:00
{"pretty_name": "Evaluation run of perlthoughts/Chupacabra-7B-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [perlthoughts/Chupacabra-7B-v2](https://huggingface.co/perlthoughts/Chupacabra-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T18:02:58.053786](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2/blob/main/results_2023-12-04T18-02-58.053786.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. 
You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6367599015709243,\n \"acc_stderr\": 0.03218025799515212,\n \"acc_norm\": 0.6396357428050704,\n \"acc_norm_stderr\": 0.0328187456646889,\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5717077514762566,\n \"mc2_stderr\": 0.0156197692783717\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.613481228668942,\n \"acc_stderr\": 0.014230084761910478,\n \"acc_norm\": 0.6518771331058021,\n \"acc_norm_stderr\": 0.013921008595179342\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6473809998008365,\n \"acc_stderr\": 0.004768088918512182,\n \"acc_norm\": 0.8338976299541924,\n \"acc_norm_stderr\": 0.003714118884317389\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n 
\"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.03252909619613197,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.03252909619613197\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 
0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782658,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782658\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 
0.03077805742293167,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099867,\n \"acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099867\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n 
\"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153183,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153183\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4033519553072626,\n \"acc_stderr\": 0.01640712303219525,\n \"acc_norm\": 0.4033519553072626,\n \"acc_norm_stderr\": 0.01640712303219525\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153273,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153273\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 
0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n \"acc_stderr\": 0.01272978538659856,\n \"acc_norm\": 0.4602346805736636,\n \"acc_norm_stderr\": 0.01272978538659856\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6519607843137255,\n \"acc_stderr\": 0.01927099870822398,\n \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.01927099870822398\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174923,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174923\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n 
\"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5717077514762566,\n \"mc2_stderr\": 0.0156197692783717\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.01161619821577323\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5473843821076573,\n \"acc_stderr\": 0.013710499070935132\n }\n}\n```", "repo_url": "https://huggingface.co/perlthoughts/Chupacabra-7B-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|arc:challenge|25_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|arc:challenge|25_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|drop|3_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|drop|3_2023-11-23T09-18-59.989572.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-23T09-18-59.989572.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|gsm8k|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|gsm8k|5_2023-11-23T09-18-59.989572.parquet"]}, 
{"split": "2023_12_03T15_21_26.428024", "path": ["**/details_harness|gsm8k|5_2023-12-03T15-21-26.428024.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hellaswag|10_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hellaswag|10_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-06-05.823190.parquet", 
"**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-06-05.823190.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-management|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-06-05.823190.parquet", 
"**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-23T09-06-05.823190.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-18-59.989572.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-18-59.989572.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-virology|5_2023-11-23T09-18-59.989572.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-02-58.053786.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-02-58.053786.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-02-58.053786.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-02-58.053786.parquet", 
"**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-02-58.053786.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-02-58.053786.parquet", 
"**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-04T18-02-58.053786.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-06-05.823190.parquet"]}, 
{"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": 
"2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-18-59.989572.parquet"]}, 
{"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-02-58.053786.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-management|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": 
[{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": 
"2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": 
["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-02-58.053786.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-virology|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": 
"2023_11_23T09_06_05.823190", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|truthfulqa:mc|0_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["**/details_harness|winogrande|5_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["**/details_harness|winogrande|5_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": ["**/details_harness|winogrande|5_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-04T18-02-58.053786.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_23T09_06_05.823190", "path": ["results_2023-11-23T09-06-05.823190.parquet"]}, {"split": "2023_11_23T09_18_59.989572", "path": ["results_2023-11-23T09-18-59.989572.parquet"]}, {"split": "2023_12_03T15_21_26.428024", "path": ["results_2023-12-03T15-21-26.428024.parquet"]}, {"split": "2023_12_04T18_02_58.053786", "path": 
["results_2023-12-04T18-02-58.053786.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T18-02-58.053786.parquet"]}]}]}
2023-12-04T18:06:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2 ## Dataset Description - Homepage: - Repository: URL - Paper: - Leaderboard: URL - Point of Contact: clementine@URL ### Dataset Summary Dataset automatically created during the evaluation run of model perlthoughts/Chupacabra-7B-v2 on the Open LLM Leaderboard. The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2023-12-04T18:02:58.053786 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ### Supported Tasks and Leaderboards ### Languages ## Dataset Structure ### Data Instances ### Data Fields ### Data Splits ## Dataset Creation ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information ## Considerations for Using the Data ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations ## Additional Information ### Dataset Curators ### Licensing Information ### Contributions
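The split-naming convention described in this card (one split per run, named by the run's timestamp, with "latest" always resolving to the most recent run) can be sketched in plain Python. This is an illustrative reconstruction of the convention only, not code from the leaderboard tooling; `resolve_latest` is a hypothetical helper name.

```python
from datetime import datetime

def resolve_latest(split_names):
    """Given run splits named like '2023_12_04T18_02_58.053786',
    return the most recent run name (what the 'latest' split points to)."""
    def parse(name):
        # Split names replace the '-' and ':' of an ISO timestamp with '_'
        return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")
    return max(split_names, key=parse)

runs = [
    "2023_11_23T09_06_05.823190",
    "2023_11_23T09_18_59.989572",
    "2023_12_04T18_02_58.053786",
]
print(resolve_latest(runs))  # -> 2023_12_04T18_02_58.053786
```

This mirrors how each config's `"latest"` entry duplicates the path of the newest timestamped split in the metadata above.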
[ "# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model perlthoughts/Chupacabra-7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T18:02:58.053786(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2", "## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL", "### Dataset Summary\n\nDataset automatically created during the evaluation run of model perlthoughts/Chupacabra-7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2023-12-04T18:02:58.053786(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "### Supported Tasks and Leaderboards", "### Languages", "## Dataset Structure", "### Data Instances", "### Data Fields", "### Data Splits", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information", "## Considerations for Using the Data", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations", "## Additional Information", "### Dataset Curators", "### Licensing Information", "### Contributions" ]
[ 6, 23, 31, 172, 67, 10, 4, 6, 6, 5, 5, 5, 7, 4, 10, 10, 5, 5, 9, 8, 8, 7, 8, 7, 5, 6, 6, 5 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model perlthoughts/Chupacabra-7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T18:02:58.053786(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions" ]
f4893d4704bc82cfc45206a8bd991aff6c5a8480
# Dataset Card for Open Prompt Answers ## Dataset Summary This dataset provides answers from different Large Language models to prompts from several public datasets. + `prompt`: a prompt from an open-source dataset + `prompt_origin`: the dataset the prompt is taken from + `Llama-2-7b-chat-hf_output`: output generation of [meta-llama/Llama-2-7b-chat-hf](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf) model + `Llama-2-7b-chat-hf_generation_time`: generation duration *in seconds* for the answer of [meta-llama/Llama-2-7b-chat-hf](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf) model + `oasst-sft-4-pythia-12b_output`: output generation of [OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5](https://huggingface.co/OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5) model + `oasst-sft-4-pythia-12b_generation_time`: generation duration *in seconds* for the answer of [OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5](https://huggingface.co/OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5) model + `vicuna-7b-v1.5_output`: output generation of [lmsys/vicuna-7b-v1.5](https://huggingface.co/lmsys/vicuna-7b-v1.5) model + `vicuna-7b-v1.5_generation_time`: generation duration *in seconds* for the answer of [lmsys/vicuna-7b-v1.5](https://huggingface.co/lmsys/vicuna-7b-v1.5) model ## Prompt Sources The prompts are a subset of all prompts of the following datasets: + [OpenAssistant/oasst1](https://huggingface.co/datasets/OpenAssistant/oasst1): only english prompts with no previous conversation tree (`role = prompter` and `parent_id = null`) + [Anthropic/hh-rlhf](https://huggingface.co/datasets/Anthropic/hh-rlhf): only the initial input of the *Human* as prompt + [tatsu-lab/alpaca](https://huggingface.co/datasets/tatsu-lab/alpaca): concatenated `instruction` and `input` to form prompt + [Dahoas/synthetic-instruct-gptj-pairwise](https://huggingface.co/datasets/Dahoas/synthetic-instruct-gptj-pairwise): prompts from `prompt` column ## Output Generation The configuration is the same for 
each model: + `temperature`: 0.7 + `max_new_tokens`: 512 + `repetition_penalty`: 1.0 The generation duration is provided (in seconds).
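The derivation rules listed under Prompt Sources can be sketched in Python. This is a hypothetical reconstruction, not the authors' actual preprocessing script: the exact join between `instruction` and `input` (newline vs. space) and the `"\n\nHuman:"` / `"\n\nAssistant:"` turn markers are assumptions based on the common formats of the upstream datasets.

```python
def prompt_from_alpaca(row: dict) -> str:
    """Concatenate `instruction` and the optional `input` field to form the prompt.

    The newline join is an assumption; the card only says the fields are concatenated.
    """
    if row.get("input"):
        return f"{row['instruction']}\n{row['input']}"
    return row["instruction"]


def prompt_from_hh_rlhf(transcript: str) -> str:
    """Keep only the initial *Human* turn of an hh-rlhf conversation transcript."""
    first_turn = transcript.split("\n\nAssistant:")[0]
    return first_turn.removeprefix("\n\nHuman:").strip()
```

For example, `prompt_from_hh_rlhf("\n\nHuman: What is 2+2?\n\nAssistant: 4")` returns `"What is 2+2?"`.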
benmainbird/prompt_answers_v1
[ "language:en", "llm", "prompts", "answers", "region:us" ]
2023-11-23T09:40:37+00:00
{"language": ["en"], "pretty_name": "Open Prompt LLM Answers", "tags": ["llm", "prompts", "answers"]}
2023-11-25T09:28:17+00:00
[]
[ "en" ]
TAGS #language-English #llm #prompts #answers #region-us
# Dataset Card for Open Prompt Answers ## Dataset Summary This dataset provides answers from different Large Language models to prompts from several public datasets. + 'prompt': a prompt from an open-source dataset + 'prompt_origin': the dataset the prompt is taken from + 'Llama-2-7b-chat-hf_output': output generation of meta-llama/Llama-2-7b-chat-hf model + 'Llama-2-7b-chat-hf_generation_time': generation duration *in seconds* for the answer of meta-llama/Llama-2-7b-chat-hf model + 'oasst-sft-4-pythia-12b_output': output generation of OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5 model + 'oasst-sft-4-pythia-12b_generation_time': generation duration *in seconds* for the answer of OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5 model + 'vicuna-7b-v1.5_output': output generation of lmsys/vicuna-7b-v1.5 model + 'vicuna-7b-v1.5_generation_time': generation duration *in seconds* for the answer of lmsys/vicuna-7b-v1.5 model ## Prompt Sources The prompts are a subset of all prompts of the following datasets: + OpenAssistant/oasst1: only english prompts with no previous conversation tree ('role = prompter' and 'parent_id = null') + Anthropic/hh-rlhf: only the initial input of the *Human* as prompt + tatsu-lab/alpaca: concatenated 'instruction' and 'input' to form prompt + Dahoas/synthetic-instruct-gptj-pairwise: prompts from 'prompt' column ## Output Generation The configuration is the same for each model: + 'temperature': 0.7 + 'max_new_tokens': 512 + 'repetition_penalty': 1.0 The generation duration is provided (in seconds).
[ "# Dataset Card for Open Prompt Answers", "## Dataset Summary\n\nThis dataset provides answers from different Large Language models to prompts from several public datasets.\n\n+ 'prompt': a prompt from an open-source dataset\n+ 'prompt_origin': the dataset the prompt is taken from\n+ 'Llama-2-7b-chat-hf_output': output generation of meta-llama/Llama-2-7b-chat-hf model\n+ 'Llama-2-7b-chat-hf_generation_time': generation duration *in seconds* for the answer of meta-llama/Llama-2-7b-chat-hf model\n+ 'oasst-sft-4-pythia-12b_output': output generation of OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5 model\n+ 'oasst-sft-4-pythia-12b_generation_time': generation duration *in seconds* for the answer of OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5 model\n+ 'vicuna-7b-v1.5_output': output generation of lmsys/vicuna-7b-v1.5 model\n+ 'vicuna-7b-v1.5_generation_time': generation duration *in seconds* for the answer of lmsys/vicuna-7b-v1.5 model", "## Prompt Sources\n\nThe prompts are a subset of all prompts of the following datasets:\n\n+ OpenAssistant/oasst1: only english prompts with no previous conversation tree ('role = prompter' and 'parent_id = null')\n+ Anthropic/hh-rlhf: only the initial input of the *Human* as prompt\n+ tatsu-lab/alpaca: concatenated 'instruction' and 'input' to form prompt\n+ Dahoas/synthetic-instruct-gptj-pairwise: prompts from 'prompt' column", "## Output Generation\n\nThe configuration is the same for each model:\n\n+ 'temperature': 0.7\n+ 'max_new_tokens': 512\n+ 'repetition_penalty': 1.0\n\nThe generation duration is provided (in seconds)." ]
[ "TAGS\n#language-English #llm #prompts #answers #region-us \n", "# Dataset Card for Open Prompt Answers", "## Dataset Summary\n\nThis dataset provides answers from different Large Language models to prompts from several public datasets.\n\n+ 'prompt': a prompt from an open-source dataset\n+ 'prompt_origin': the dataset the prompt is taken from\n+ 'Llama-2-7b-chat-hf_output': output generation of meta-llama/Llama-2-7b-chat-hf model\n+ 'Llama-2-7b-chat-hf_generation_time': generation duration *in seconds* for the answer of meta-llama/Llama-2-7b-chat-hf model\n+ 'oasst-sft-4-pythia-12b_output': output generation of OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5 model\n+ 'oasst-sft-4-pythia-12b_generation_time': generation duration *in seconds* for the answer of OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5 model\n+ 'vicuna-7b-v1.5_output': output generation of lmsys/vicuna-7b-v1.5 model\n+ 'vicuna-7b-v1.5_generation_time': generation duration *in seconds* for the answer of lmsys/vicuna-7b-v1.5 model", "## Prompt Sources\n\nThe prompts are a subset of all prompts of the following datasets:\n\n+ OpenAssistant/oasst1: only english prompts with no previous conversation tree ('role = prompter' and 'parent_id = null')\n+ Anthropic/hh-rlhf: only the initial input of the *Human* as prompt\n+ tatsu-lab/alpaca: concatenated 'instruction' and 'input' to form prompt\n+ Dahoas/synthetic-instruct-gptj-pairwise: prompts from 'prompt' column", "## Output Generation\n\nThe configuration is the same for each model:\n\n+ 'temperature': 0.7\n+ 'max_new_tokens': 512\n+ 'repetition_penalty': 1.0\n\nThe generation duration is provided (in seconds)." ]
[ 20, 11, 310, 136, 54 ]
[ "passage: TAGS\n#language-English #llm #prompts #answers #region-us \n# Dataset Card for Open Prompt Answers## Dataset Summary\n\nThis dataset provides answers from different Large Language models to prompts from several public datasets.\n\n+ 'prompt': a prompt from an open-source dataset\n+ 'prompt_origin': the dataset the prompt is taken from\n+ 'Llama-2-7b-chat-hf_output': output generation of meta-llama/Llama-2-7b-chat-hf model\n+ 'Llama-2-7b-chat-hf_generation_time': generation duration *in seconds* for the answer of meta-llama/Llama-2-7b-chat-hf model\n+ 'oasst-sft-4-pythia-12b_output': output generation of OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5 model\n+ 'oasst-sft-4-pythia-12b_generation_time': generation duration *in seconds* for the answer of OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5 model\n+ 'vicuna-7b-v1.5_output': output generation of lmsys/vicuna-7b-v1.5 model\n+ 'vicuna-7b-v1.5_generation_time': generation duration *in seconds* for the answer of lmsys/vicuna-7b-v1.5 model## Prompt Sources\n\nThe prompts are a subset of all prompts of the following datasets:\n\n+ OpenAssistant/oasst1: only english prompts with no previous conversation tree ('role = prompter' and 'parent_id = null')\n+ Anthropic/hh-rlhf: only the initial input of the *Human* as prompt\n+ tatsu-lab/alpaca: concatenated 'instruction' and 'input' to form prompt\n+ Dahoas/synthetic-instruct-gptj-pairwise: prompts from 'prompt' column" ]
c489203c8305055bb05c92ec8c7dbc730470c383
# Dataset Card for "mt_bench_judge" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
SUSTech/mt_bench_judge
[ "region:us" ]
2023-11-23T10:03:09+00:00
{"dataset_info": {"features": [{"name": "question_id", "dtype": "int64"}, {"name": "model", "dtype": "string"}, {"name": "conversation", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "turn", "dtype": "int64"}, {"name": "judge", "sequence": "string"}, {"name": "user_prompt", "dtype": "string"}, {"name": "judgment", "dtype": "string"}, {"name": "score", "dtype": "float64"}, {"name": "tstamp", "dtype": "float64"}, {"name": "category", "dtype": "string"}, {"name": "reference", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 4409406, "num_examples": 800}], "download_size": 949262, "dataset_size": 4409406}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-11-23T10:18:55+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mt_bench_judge" More Information needed
[ "# Dataset Card for \"mt_bench_judge\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mt_bench_judge\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mt_bench_judge\"\n\nMore Information needed" ]
ee120f24e3a6c074482277b6162048f81d4aacba
# Dataset Card for "instruct_out_bc_data" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tr416/instruct_out_bc_data
[ "region:us" ]
2023-11-23T10:25:38+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 29059508, "num_examples": 29581}], "download_size": 14969317, "dataset_size": 29059508}}
2023-11-23T10:25:40+00:00
[]
[]
TAGS #region-us
# Dataset Card for "instruct_out_bc_data" More Information needed
[ "# Dataset Card for \"instruct_out_bc_data\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"instruct_out_bc_data\"\n\nMore Information needed" ]
[ 6, 18 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"instruct_out_bc_data\"\n\nMore Information needed" ]
183945f3f189d1b32dfdebdb19ebb56e6f5300d8
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. 
--> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. 
--> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
AllanOuii/ikomia_doc_1
[ "task_categories:table-question-answering", "size_categories:n<1K", "language:en", "license:mit", "code", "region:us" ]
2023-11-23T10:54:38+00:00
{"language": ["en"], "license": "mit", "size_categories": ["n<1K"], "task_categories": ["table-question-answering"], "tags": ["code"]}
2023-11-24T14:15:36+00:00
[]
[ "en" ]
TAGS #task_categories-table-question-answering #size_categories-n<1K #language-English #license-mit #code #region-us
# Dataset Card for Dataset Name This dataset card aims to be a base template for new datasets. It has been generated using this raw template. ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#task_categories-table-question-answering #size_categories-n<1K #language-English #license-mit #code #region-us \n", "# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ 41, 34, 4, 40, 29, 3, 4, 9, 6, 5, 7, 4, 7, 10, 9, 5, 9, 8, 10, 46, 8, 7, 10, 5 ]
[ "passage: TAGS\n#task_categories-table-question-answering #size_categories-n<1K #language-English #license-mit #code #region-us \n# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact" ]
b047db66a9df93d99e38fd6f2dfd38c2684005ec
# Dataset Card for "watches-plus-3D-views-dataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
rjaiswal/watches-plus-3D-views-dataset
[ "region:us" ]
2023-11-23T10:58:42+00:00
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 16328707.0, "num_examples": 186}], "download_size": 16234485, "dataset_size": 16328707.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-11-30T13:11:49+00:00
[]
[]
TAGS #region-us
# Dataset Card for "watches-plus-3D-views-dataset" More Information needed
[ "# Dataset Card for \"watches-plus-3D-views-dataset\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"watches-plus-3D-views-dataset\"\n\nMore Information needed" ]
[ 6, 22 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"watches-plus-3D-views-dataset\"\n\nMore Information needed" ]
d9f97c1fad37a3c19104cb2245190ac1225c99c6
# CRAB: Causal Reasoning Assessment Benchmark ## Dataset Details ## Dataset Creation ## Splits - Tasks ### Pairwise Causality Assessment ### Graded Causality Assessment ## Citation To cite 🦀 CRAB, please use: ``` @inproceedings{romanou2023crab, title={CRAB: Assessing the Strength of Causal Relationships Between Real-world Events}, author={Angelika Romanou and Syrielle Montariol and Debjit Paul and Leo Laugier and Karl Aberer and Antoine Bosselut}, year={2023}, eprint={2311.04284}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
angelika/CRAB
[ "task_categories:text-classification", "task_categories:multiple-choice", "size_categories:1K<n<10K", "language:en", "arxiv:2311.04284", "region:us" ]
2023-11-23T10:59:07+00:00
{"language": ["en"], "size_categories": ["1K<n<10K"], "task_categories": ["text-classification", "multiple-choice"]}
2023-11-26T16:09:58+00:00
[ "2311.04284" ]
[ "en" ]
TAGS #task_categories-text-classification #task_categories-multiple-choice #size_categories-1K<n<10K #language-English #arxiv-2311.04284 #region-us
# CRAB: Causal Reasoning Assessment Benchmark ## Dataset Details ## Dataset Creation ## Splits - Tasks ### Pairwise Causality Assessment ### Graded Causality Assessment To cite CRAB, please use:
[ "# CRAB: Causal Reasoning Assessment Benchmark", "## Dataset Details", "## Dataset Creation", "## Splits - Tasks", "### Pairwise Causality Assessment", "### Graded Causality Assessment\n\n\nTo cite CRAB, please use:" ]
[ "TAGS\n#task_categories-text-classification #task_categories-multiple-choice #size_categories-1K<n<10K #language-English #arxiv-2311.04284 #region-us \n", "# CRAB: Causal Reasoning Assessment Benchmark", "## Dataset Details", "## Dataset Creation", "## Splits - Tasks", "### Pairwise Causality Assessment", "### Graded Causality Assessment\n\n\nTo cite CRAB, please use:" ]
[ 55, 12, 4, 5, 6, 9, 16 ]
[ "passage: TAGS\n#task_categories-text-classification #task_categories-multiple-choice #size_categories-1K<n<10K #language-English #arxiv-2311.04284 #region-us \n# CRAB: Causal Reasoning Assessment Benchmark## Dataset Details## Dataset Creation## Splits - Tasks### Pairwise Causality Assessment### Graded Causality Assessment\n\n\nTo cite CRAB, please use:" ]
bd2f935760c1aeeba92daf1af73b7626bb60a453
# Dataset Card for "mc4_nl_sentences_test" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dennisc1/mc4_nl_sentences_test
[ "region:us" ]
2023-11-23T11:18:25+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 14701, "num_examples": 121}], "download_size": 14203, "dataset_size": 14701}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-11-23T11:18:28+00:00
[]
[]
TAGS #region-us
# Dataset Card for "mc4_nl_sentences_test" More Information needed
[ "# Dataset Card for \"mc4_nl_sentences_test\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"mc4_nl_sentences_test\"\n\nMore Information needed" ]
[ 6, 20 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"mc4_nl_sentences_test\"\n\nMore Information needed" ]
c73124008065ffc08e50ef2e12efe886c2b24587
# Dataset Card for "stack-exchange-preferences-20230914-clean-anonymization" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
bigcode/stack-exchange-preferences-20230914-clean-anonymization
[ "region:us" ]
2023-11-23T11:36:10+00:00
{"dataset_info": {"features": [{"name": "qid", "dtype": "int64"}, {"name": "question", "dtype": "string"}, {"name": "answers", "list": [{"name": "answer_id", "dtype": "int64"}, {"name": "author", "dtype": "string"}, {"name": "author_id", "dtype": "int64"}, {"name": "author_profile", "dtype": "string"}, {"name": "pm_score", "dtype": "int64"}, {"name": "selected", "dtype": "bool"}, {"name": "text", "dtype": "string"}]}, {"name": "date", "dtype": "string"}, {"name": "metadata", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 37966876013, "num_examples": 10404628}], "download_size": 17879223994, "dataset_size": 37966876013}}
2023-11-23T11:59:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for "stack-exchange-preferences-20230914-clean-anonymization" More Information needed
[ "# Dataset Card for \"stack-exchange-preferences-20230914-clean-anonymization\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"stack-exchange-preferences-20230914-clean-anonymization\"\n\nMore Information needed" ]
[ 6, 29 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"stack-exchange-preferences-20230914-clean-anonymization\"\n\nMore Information needed" ]
48aedd7a8ea08efc75cae3e81fbefa9720df0812
# Dataset Card for CML-TTS ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks](#supported-tasks) - [Languages](#languages) - [How to use](#how-to-use) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Data Statistics](#data-statistics) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** [MultiLingual LibriSpeech ASR corpus](https://www.openslr.org/146/) - **Repository:** [CML-TTS-Dataset](https://github.com/freds0/CML-TTS-Dataset) - **Paper:** [CML-TTS A Multilingual Dataset for Speech Synthesis in Low-Resource Languages](https://arxiv.org/abs/2306.10097) ### Dataset Summary CML-TTS is a recursive acronym for CML-Multi-Lingual-TTS, a Text-to-Speech (TTS) dataset developed at the Center of Excellence in Artificial Intelligence (CEIA) of the Federal University of Goias (UFG). CML-TTS is a dataset comprising audiobooks sourced from the public domain books of Project Gutenberg, read by volunteers from the LibriVox project. The dataset includes recordings in Dutch, German, French, Italian, Polish, Portuguese, and Spanish, all at a sampling rate of 24kHz. 
The data archives were restructured from the original ones from [OpenSLR](http://www.openslr.org/146) to make it easier to stream. ### Supported Tasks - `text-to-speech`, `text-to-audio`: The dataset can also be used to train a model for Text-To-Speech (TTS). ### Languages The dataset includes recordings in Dutch, German, French, Italian, Polish, Portuguese, and Spanish, all at a sampling rate of 24kHz. ### How to use The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function. For example, to download the German config, simply specify the corresponding language config name (i.e., "german" for German): ```python from datasets import load_dataset mls = load_dataset("ylacombe/cml-tts", "german", split="train") ``` Using the datasets library, you can also stream the dataset on-the-fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk. ```python from datasets import load_dataset mls = load_dataset("ylacombe/cml-tts", "german", split="train", streaming=True) print(next(iter(mls))) ``` #### *Bonus* You can create a [PyTorch dataloader](https://huggingface.co/docs/datasets/use_with_pytorch) directly with your own datasets (local/streamed). 
**Local:** ```python from datasets import load_dataset from torch.utils.data import BatchSampler, DataLoader, RandomSampler mls = load_dataset("ylacombe/cml-tts", "german", split="train") batch_sampler = BatchSampler(RandomSampler(mls), batch_size=32, drop_last=False) dataloader = DataLoader(mls, batch_sampler=batch_sampler) ``` **Streaming:** ```python from datasets import load_dataset from torch.utils.data import DataLoader mls = load_dataset("ylacombe/cml-tts", "german", split="train", streaming=True) dataloader = DataLoader(mls, batch_size=32) ``` To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets). ## Dataset Structure ### Data Instances A typical data point comprises the path to the audio file, usually called `file`, and its transcription, called `text`. Some additional information about the speaker and the passage which contains the transcription is provided. ``` {'audio': {'path': '6892_8912_000729.wav', 'array': array([-1.52587891e-...7344e-05]), 'sampling_rate': 24000}, 'wav_filesize': 601964, 'text': 'Proszę pana, tu pano... zdziwiony', 'transcript_wav2vec': 'proszę pana tu panow... zdziwiony', 'levenshtein': 0.96045197740113, 'duration': 13.648979591836737, 'num_words': 29, 'speaker_id': 6892} ``` ### Data Fields - audio: A dictionary containing the audio filename, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`. - text: the transcription of the audio file. - speaker_id: unique id of the speaker. 
The same speaker id can be found for multiple data samples. - transcript_wav2vec: the transcription of the audio file using the wav2vec model. Has been used to curate the dataset. - wav_filesize: The size of the audio waveform file. Has been used to curate the dataset. - levenshtein: The [Levenshtein distance](https://en.wikipedia.org/wiki/Levenshtein_distance) between the wav2vec transcription and the original transcription. Has been used to curate the dataset. - duration: The duration of the audio in seconds. - num_words: The number of words of the transcription. ### Data Splits | # Samples | Train | Dev | Test | |------------|--------|------|------| | german | 608296 | 5314 | 5466 | | dutch | 309785 | 4834 | 4570 | | french | 107598 | 3739 | 3763 | | spanish | 168524 | 3148 | 3080 | | italian | 50345 | 1765 | 1835 | | portuguese | 34265 | 1134 | 1297 | | polish | 18719 | 853 | 814 | ### Data Statistics | Language | Duration (Train) | Duration (Test) | Duration (Dev) | Speakers (Train) | Speakers (Test) | Speakers (Dev) | |------------|-------------------|------------------|----------------|------------------|-----------------|----------------| | | M | F | M | F | M | F | M | F | M | F | M | F | | Dutch | 482.82 | 162.17 | 2.46 | 1.29 | 2.24 | 1.67 | 8 | 27 | 3 | 3 | 2 | 4 | | French | 260.08 | 24.04 | 2.48 | 3.55 | 3.31 | 2.72 | 25 | 20 | 8 | 9 | 10 | 8 | | German | 1128.96 | 436.64 | 3.75 | 5.27 | 4.31 | 5.03 | 78 | 90 | 13 | 17 | 13 | 15 | | Italian | 73.78 | 57.51 | 1.47 | 0.85 | 0.40 | 1.52 | 23 | 38 | 5 | 5 | 4 | 6 | | Polish | 30.61 | 8.32 | 0.70 | 0.90 | 0.56 | 0.80 | 4 | 4 | 2 | 2 | 2 | 2 | | Portuguese | 23.14 | 44.81 | 0.28 | 0.24 | 0.68 | 0.20 | 20 | 10 | 5 | 4 | 6 | 3 | | Spanish | 279.15 | 164.08 | 2.77 | 2.06 | 3.40 | 2.34 | 35 | 42 | 10 | 8 | 11 | 9 | | Total | 3,176.13| | 28.11 | | 29.19 | | 424 | | 94 | | 95 | | ## Dataset Creation ### Curation Rationale [Needs More Information] ### Source Data #### Initial Data Collection and Normalization 
[Needs More Information] #### Who are the source language producers? [Needs More Information] ### Annotations #### Annotation process [Needs More Information] #### Who are the annotators? [Needs More Information] ### Personal and Sensitive Information The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in this dataset. ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [Needs More Information] ## Additional Information ### Dataset Curators [Needs More Information] ### Licensing Information Public Domain, Creative Commons Attribution 4.0 International Public License ([CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/legalcode)) ### Citation Information ``` @misc{oliveira2023cmltts, title={CML-TTS A Multilingual Dataset for Speech Synthesis in Low-Resource Languages}, author={Frederico S. Oliveira and Edresson Casanova and Arnaldo Cândido Júnior and Anderson S. Soares and Arlindo R. Galvão Filho}, year={2023}, eprint={2306.10097}, archivePrefix={arXiv}, primaryClass={eess.AS} } ``` ### Contributions Thanks to [@ylacombe](https://github.com/ylacombe) for adding this dataset.
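The `levenshtein` field described above was used to filter out samples whose wav2vec transcript diverges from the reference text. The card does not give the exact formula, and the example value (≈0.96) suggests a normalized similarity (1.0 for a perfect match) rather than a raw edit distance; the sketch below assumes `similarity = 1 - distance / max(len(a), len(b))`, which is an assumption, not the authors' documented method.

```python
def levenshtein_distance(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance (insertions, deletions, substitutions)."""
    if len(a) < len(b):
        a, b = b, a
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                  # deletion
                            curr[j - 1] + 1,              # insertion
                            prev[j - 1] + (ca != cb)))    # substitution
        prev = curr
    return prev[-1]


def transcript_similarity(text: str, transcript_wav2vec: str) -> float:
    """Normalized similarity in [0, 1]; 1.0 means the two transcripts agree exactly."""
    if not text and not transcript_wav2vec:
        return 1.0
    dist = levenshtein_distance(text.lower(), transcript_wav2vec.lower())
    return 1.0 - dist / max(len(text), len(transcript_wav2vec))


# Curation sketch: keep samples whose wav2vec transcript closely matches the reference.
keep = transcript_similarity("proszę pana", "prosze pana") > 0.9  # True: one accented character differs
```

A threshold such as 0.9 here is illustrative only; the dataset card does not state what cutoff, if any, was applied.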
ylacombe/cml-tts
[ "task_categories:text-to-speech", "task_categories:text-to-audio", "size_categories:1M<n<10M", "language:nl", "language:fr", "language:de", "language:it", "language:pl", "language:pt", "language:es", "license:cc-by-4.0", "arxiv:2306.10097", "region:us" ]
2023-11-23T12:01:49+00:00
{"language": ["nl", "fr", "de", "it", "pl", "pt", "es"], "license": "cc-by-4.0", "size_categories": ["1M<n<10M"], "task_categories": ["text-to-speech", "text-to-audio"], "pretty_name": "CML-TTS", "dataset_info": [{"config_name": "dutch", "features": [{"name": "audio", "dtype": "audio"}, {"name": "wav_filesize", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "transcript_wav2vec", "dtype": "string"}, {"name": "levenshtein", "dtype": "float64"}, {"name": "duration", "dtype": "float64"}, {"name": "num_words", "dtype": "int64"}, {"name": "speaker_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 186374683541.98, "num_examples": 309785}, {"name": "dev", "num_bytes": 2912063172.928, "num_examples": 4834}, {"name": "test", "num_bytes": 2757891736.78, "num_examples": 4570}], "download_size": 132987704971, "dataset_size": 192044638451.68802}, {"config_name": "french", "features": [{"name": "audio", "dtype": "audio"}, {"name": "wav_filesize", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "transcript_wav2vec", "dtype": "string"}, {"name": "levenshtein", "dtype": "float64"}, {"name": "duration", "dtype": "float64"}, {"name": "num_words", "dtype": "int64"}, {"name": "speaker_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 64984002840.768, "num_examples": 107598}, {"name": "dev", "num_bytes": 2257393207.796, "num_examples": 3739}, {"name": "test", "num_bytes": 2281630546.306, "num_examples": 3763}], "download_size": 48345998335, "dataset_size": 69523026594.87}, {"config_name": "german", "features": [{"name": "audio", "dtype": "audio"}, {"name": "wav_filesize", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "transcript_wav2vec", "dtype": "string"}, {"name": "levenshtein", "dtype": "float64"}, {"name": "duration", "dtype": "float64"}, {"name": "num_words", "dtype": "int64"}, {"name": "speaker_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 369052038020.872, "num_examples": 
608296}, {"name": "dev", "num_bytes": 3197115278.604, "num_examples": 5314}, {"name": "test", "num_bytes": 3288183839.092, "num_examples": 5466}], "download_size": 280438261836, "dataset_size": 375537337138.568}, {"config_name": "italian", "features": [{"name": "audio", "dtype": "audio"}, {"name": "wav_filesize", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "transcript_wav2vec", "dtype": "string"}, {"name": "levenshtein", "dtype": "float64"}, {"name": "duration", "dtype": "float64"}, {"name": "num_words", "dtype": "int64"}, {"name": "speaker_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 30242801015.92, "num_examples": 50345}, {"name": "dev", "num_bytes": 938644924.81, "num_examples": 1765}, {"name": "test", "num_bytes": 979116355.51, "num_examples": 1835}], "download_size": 21996805791, "dataset_size": 32160562296.239998}, {"config_name": "polish", "features": [{"name": "audio", "dtype": "audio"}, {"name": "wav_filesize", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "transcript_wav2vec", "dtype": "string"}, {"name": "levenshtein", "dtype": "float64"}, {"name": "duration", "dtype": "float64"}, {"name": "num_words", "dtype": "int64"}, {"name": "speaker_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 11127461686.356, "num_examples": 18719}, {"name": "dev", "num_bytes": 356048249, "num_examples": 853}, {"name": "test", "num_bytes": 367796887, "num_examples": 814}], "download_size": 8114633186, "dataset_size": 11851306822.356}, {"config_name": "portuguese", "features": [{"name": "audio", "dtype": "audio"}, {"name": "wav_filesize", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "transcript_wav2vec", "dtype": "string"}, {"name": "levenshtein", "dtype": "float64"}, {"name": "duration", "dtype": "float64"}, {"name": "num_words", "dtype": "int64"}, {"name": "speaker_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 20722423371.0, "num_examples": 34265}, {"name": 
"dev", "num_bytes": 622824524.224, "num_examples": 1134}, {"name": "test", "num_bytes": 673141068.9, "num_examples": 1297}], "download_size": 14421097659, "dataset_size": 22018388964.124}, {"config_name": "spanish", "features": [{"name": "audio", "dtype": "audio"}, {"name": "wav_filesize", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "transcript_wav2vec", "dtype": "string"}, {"name": "levenshtein", "dtype": "float64"}, {"name": "duration", "dtype": "float64"}, {"name": "num_words", "dtype": "int64"}, {"name": "speaker_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 101377452063.176, "num_examples": 168524}, {"name": "dev", "num_bytes": 1882729515.184, "num_examples": 3148}, {"name": "test", "num_bytes": 1851592818.0, "num_examples": 3080}], "download_size": 73687756096, "dataset_size": 105111774396.36}], "configs": [{"config_name": "dutch", "data_files": [{"split": "train", "path": "dutch/train-*"}, {"split": "dev", "path": "dutch/dev-*"}, {"split": "test", "path": "dutch/test-*"}]}, {"config_name": "french", "data_files": [{"split": "train", "path": "french/train-*"}, {"split": "dev", "path": "french/dev-*"}, {"split": "test", "path": "french/test-*"}]}, {"config_name": "german", "data_files": [{"split": "train", "path": "german/train-*"}, {"split": "dev", "path": "german/dev-*"}, {"split": "test", "path": "german/test-*"}]}, {"config_name": "italian", "data_files": [{"split": "train", "path": "italian/train-*"}, {"split": "dev", "path": "italian/dev-*"}, {"split": "test", "path": "italian/test-*"}]}, {"config_name": "polish", "data_files": [{"split": "train", "path": "polish/train-*"}, {"split": "dev", "path": "polish/dev-*"}, {"split": "test", "path": "polish/test-*"}]}, {"config_name": "portuguese", "data_files": [{"split": "train", "path": "portuguese/train-*"}, {"split": "dev", "path": "portuguese/dev-*"}, {"split": "test", "path": "portuguese/test-*"}]}, {"config_name": "spanish", "data_files": [{"split": "train", 
"path": "spanish/train-*"}, {"split": "dev", "path": "spanish/dev-*"}, {"split": "test", "path": "spanish/test-*"}]}]}
2023-11-24T14:48:29+00:00
[ "2306.10097" ]
[ "nl", "fr", "de", "it", "pl", "pt", "es" ]
TAGS #task_categories-text-to-speech #task_categories-text-to-audio #size_categories-1M<n<10M #language-Dutch #language-French #language-German #language-Italian #language-Polish #language-Portuguese #language-Spanish #license-cc-by-4.0 #arxiv-2306.10097 #region-us
Dataset Card for CML-TTS ======================== Table of Contents ----------------- * Dataset Description + Dataset Summary + Supported Tasks + Languages + How to use * Dataset Structure + Data Instances + Data Fields + Data Splits + Data Statistics * Dataset Creation + Curation Rationale + Source Data + Annotations + Personal and Sensitive Information * Considerations for Using the Data + Social Impact of Dataset + Discussion of Biases + Other Known Limitations * Additional Information + Dataset Curators + Licensing Information + Citation Information + Contributions Dataset Description ------------------- * Homepage: MultiLingual LibriSpeech ASR corpus * Repository: CML-TTS-Dataset * Paper: CML-TTS A Multilingual Dataset for Speech Synthesis in Low-Resource Languages ### Dataset Summary CML-TTS is a recursive acronym for CML-Multi-Lingual-TTS, a Text-to-Speech (TTS) dataset developed at the Center of Excellence in Artificial Intelligence (CEIA) of the Federal University of Goias (UFG). CML-TTS is a dataset comprising audiobooks sourced from the public domain books of Project Gutenberg, read by volunteers from the LibriVox project. The dataset includes recordings in Dutch, German, French, Italian, Polish, Portuguese, and Spanish, all at a sampling rate of 24kHz. The data archives were restructured from the original ones from OpenSLR to make it easier to stream. ### Supported Tasks * 'text-to-speech', 'text-to-audio': The dataset can also be used to train a model for Text-To-Speech (TTS). ### Languages The dataset includes recordings in Dutch, German, French, Italian, Polish, Portuguese, and Spanish, all at a sampling rate of 24kHz. ### How to use The 'datasets' library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the 'load\_dataset' function. 
For example, to download the German config, simply specify the corresponding language config name (i.e., "german" for German): Using the datasets library, you can also stream the dataset on-the-fly by adding a 'streaming=True' argument to the 'load\_dataset' function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk. #### *Bonus* You can create a PyTorch dataloader directly with your own datasets (local/streamed). Local: Streaming: To find out more about loading and preparing audio datasets, head over to URL Dataset Structure ----------------- ### Data Instances A typical data point comprises the path to the audio file, usually called 'file' and its transcription, called 'text'. Some additional information about the speaker and the passage which contains the transcription is provided. ### Data Fields * audio: A dictionary containing the audio filename, the decoded audio array, and the sampling rate. Note that when accessing the audio column: 'dataset[0]["audio"]' the audio file is automatically decoded and resampled to 'dataset.features["audio"].sampling\_rate'. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the '"audio"' column, *i.e.* 'dataset[0]["audio"]' should always be preferred over 'dataset["audio"][0]'. * text: the transcription of the audio file. * speaker\_id: unique id of the speaker. The same speaker id can be found for multiple data samples. * transcript\_wav2vec: the transcription of the audio file using the wav2vec model. Has been used to curate the dataset. * wav\_filesize: The size of the audio waveform file. Has been used to curate the dataset. * levenshtein: The Levenshtein distance between the wav2vec transcription and the original transcription. Has been used to curate the dataset. * duration: The duration of the audio in seconds. 
* num\_words: The number of words of the transcription. ### Data Splits ### Data Statistics Dataset Creation ---------------- ### Curation Rationale ### Source Data #### Initial Data Collection and Normalization #### Who are the source language producers? ### Annotations #### Annotation process #### Who are the annotators? ### Personal and Sensitive Information The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in this dataset. Considerations for Using the Data --------------------------------- ### Social Impact of Dataset ### Discussion of Biases ### Other Known Limitations Additional Information ---------------------- ### Dataset Curators ### Licensing Information Public Domain, Creative Commons Attribution 4.0 International Public License (CC-BY-4.0) ### Contributions Thanks to @ylacombe for adding this dataset.
[ "### Dataset Summary\n\n\nCML-TTS is a recursive acronym for CML-Multi-Lingual-TTS, a Text-to-Speech (TTS) dataset developed at the Center of Excellence in Artificial Intelligence (CEIA) of the Federal University of Goias (UFG).\nCML-TTS is a dataset comprising audiobooks sourced from the public domain books of Project Gutenberg, read by volunteers from the LibriVox project. The dataset includes recordings in Dutch, German, French, Italian, Polish, Portuguese, and Spanish, all at a sampling rate of 24kHz.\n\n\nThe data archives were restructured from the original ones from OpenSLR to make it easier to stream.", "### Supported Tasks\n\n\n* 'text-to-speech', 'text-to-audio': The dataset can also be used to train a model for Text-To-Speech (TTS).", "### Languages\n\n\nThe dataset includes recordings in Dutch, German, French, Italian, Polish, Portuguese, and Spanish, all at a sampling rate of 24kHz.", "### How to use\n\n\nThe 'datasets' library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the 'load\\_dataset' function.\n\n\nFor example, to download the German config, simply specify the corresponding language config name (i.e., \"german\" for German):\n\n\nUsing the datasets library, you can also stream the dataset on-the-fly by adding a 'streaming=True' argument to the 'load\\_dataset' function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.", "#### *Bonus*\n\n\nYou can create a PyTorch dataloader directly with your own datasets (local/streamed).\n\n\nLocal:\n\n\nStreaming:\n\n\nTo find out more about loading and preparing audio datasets, head over to URL\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nA typical data point comprises the path to the audio file, usually called 'file' and its transcription, called 'text'. 
Some additional information about the speaker and the passage which contains the transcription is provided.", "### Data Fields\n\n\n* audio: A dictionary containing the audio filename, the decoded audio array, and the sampling rate. Note that when accessing the audio column: 'dataset[0][\"audio\"]' the audio file is automatically decoded and resampled to 'dataset.features[\"audio\"].sampling\\_rate'. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the '\"audio\"' column, *i.e.* 'dataset[0][\"audio\"]' should always be preferred over 'dataset[\"audio\"][0]'.\n* text: the transcription of the audio file.\n* speaker\\_id: unique id of the speaker. The same speaker id can be found for multiple data samples.\n* transcript\\_wav2vec: the transcription of the audio file using the wav2vec model. Has been used to curate the dataset.\n* wav\\_filesize: The size of the audio waveform file. Has been used to curate the dataset.\n* levenshtein: The Levenshtein distance between the wav2vec transcription and the original transcription. Has been used to curate the dataset.\n* duration: The duration of the audio in seconds.\n* num\\_words: The number of words of the transcription.", "### Data Splits", "### Data Statistics\n\n\n\nDataset Creation\n----------------", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information\n\n\nThe dataset consists of people who have donated their voice online. 
You agree to not attempt to determine the identity of speakers in this dataset.\n\n\nConsiderations for Using the Data\n---------------------------------", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations\n\n\nAdditional Information\n----------------------", "### Dataset Curators", "### Licensing Information\n\n\nPublic Domain, Creative Commons Attribution 4.0 International Public License (CC-BY-4.0)", "### Contributions\n\n\nThanks to @ylacombe for adding this dataset." ]
[ "TAGS\n#task_categories-text-to-speech #task_categories-text-to-audio #size_categories-1M<n<10M #language-Dutch #language-French #language-German #language-Italian #language-Polish #language-Portuguese #language-Spanish #license-cc-by-4.0 #arxiv-2306.10097 #region-us \n", "### Dataset Summary\n\n\nCML-TTS is a recursive acronym for CML-Multi-Lingual-TTS, a Text-to-Speech (TTS) dataset developed at the Center of Excellence in Artificial Intelligence (CEIA) of the Federal University of Goias (UFG).\nCML-TTS is a dataset comprising audiobooks sourced from the public domain books of Project Gutenberg, read by volunteers from the LibriVox project. The dataset includes recordings in Dutch, German, French, Italian, Polish, Portuguese, and Spanish, all at a sampling rate of 24kHz.\n\n\nThe data archives were restructured from the original ones from OpenSLR to make it easier to stream.", "### Supported Tasks\n\n\n* 'text-to-speech', 'text-to-audio': The dataset can also be used to train a model for Text-To-Speech (TTS).", "### Languages\n\n\nThe dataset includes recordings in Dutch, German, French, Italian, Polish, Portuguese, and Spanish, all at a sampling rate of 24kHz.", "### How to use\n\n\nThe 'datasets' library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the 'load\\_dataset' function.\n\n\nFor example, to download the German config, simply specify the corresponding language config name (i.e., \"german\" for German):\n\n\nUsing the datasets library, you can also stream the dataset on-the-fly by adding a 'streaming=True' argument to the 'load\\_dataset' function call. 
Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.", "#### *Bonus*\n\n\nYou can create a PyTorch dataloader directly with your own datasets (local/streamed).\n\n\nLocal:\n\n\nStreaming:\n\n\nTo find out more about loading and preparing audio datasets, head over to URL\n\n\nDataset Structure\n-----------------", "### Data Instances\n\n\nA typical data point comprises the path to the audio file, usually called 'file' and its transcription, called 'text'. Some additional information about the speaker and the passage which contains the transcription is provided.", "### Data Fields\n\n\n* audio: A dictionary containing the audio filename, the decoded audio array, and the sampling rate. Note that when accessing the audio column: 'dataset[0][\"audio\"]' the audio file is automatically decoded and resampled to 'dataset.features[\"audio\"].sampling\\_rate'. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the '\"audio\"' column, *i.e.* 'dataset[0][\"audio\"]' should always be preferred over 'dataset[\"audio\"][0]'.\n* text: the transcription of the audio file.\n* speaker\\_id: unique id of the speaker. The same speaker id can be found for multiple data samples.\n* transcript\\_wav2vec: the transcription of the audio file using the wav2vec model. Has been used to curate the dataset.\n* wav\\_filesize: The size of the audio waveform file. Has been used to curate the dataset.\n* levenshtein: The Levenshtein distance between the wav2vec transcription and the original transcription. 
Has been used to curate the dataset.\n* duration: The duration of the audio in seconds.\n* num\\_words: The number of words of the transcription.", "### Data Splits", "### Data Statistics\n\n\n\nDataset Creation\n----------------", "### Curation Rationale", "### Source Data", "#### Initial Data Collection and Normalization", "#### Who are the source language producers?", "### Annotations", "#### Annotation process", "#### Who are the annotators?", "### Personal and Sensitive Information\n\n\nThe dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in this dataset.\n\n\nConsiderations for Using the Data\n---------------------------------", "### Social Impact of Dataset", "### Discussion of Biases", "### Other Known Limitations\n\n\nAdditional Information\n----------------------", "### Dataset Curators", "### Licensing Information\n\n\nPublic Domain, Creative Commons Attribution 4.0 International Public License (CC-BY-4.0)", "### Contributions\n\n\nThanks to @ylacombe for adding this dataset." ]
[ 98, 166, 48, 41, 168, 59, 52, 337, 5, 11, 7, 4, 10, 10, 5, 5, 9, 50, 7, 8, 14, 6, 23, 17 ]
[ "passage: TAGS\n#task_categories-text-to-speech #task_categories-text-to-audio #size_categories-1M<n<10M #language-Dutch #language-French #language-German #language-Italian #language-Polish #language-Portuguese #language-Spanish #license-cc-by-4.0 #arxiv-2306.10097 #region-us \n### Dataset Summary\n\n\nCML-TTS is a recursive acronym for CML-Multi-Lingual-TTS, a Text-to-Speech (TTS) dataset developed at the Center of Excellence in Artificial Intelligence (CEIA) of the Federal University of Goias (UFG).\nCML-TTS is a dataset comprising audiobooks sourced from the public domain books of Project Gutenberg, read by volunteers from the LibriVox project. The dataset includes recordings in Dutch, German, French, Italian, Polish, Portuguese, and Spanish, all at a sampling rate of 24kHz.\n\n\nThe data archives were restructured from the original ones from OpenSLR to make it easier to stream.### Supported Tasks\n\n\n* 'text-to-speech', 'text-to-audio': The dataset can also be used to train a model for Text-To-Speech (TTS).### Languages\n\n\nThe dataset includes recordings in Dutch, German, French, Italian, Polish, Portuguese, and Spanish, all at a sampling rate of 24kHz.", "passage: ### How to use\n\n\nThe 'datasets' library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the 'load\\_dataset' function.\n\n\nFor example, to download the German config, simply specify the corresponding language config name (i.e., \"german\" for German):\n\n\nUsing the datasets library, you can also stream the dataset on-the-fly by adding a 'streaming=True' argument to the 'load\\_dataset' function call. 
Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.#### *Bonus*\n\n\nYou can create a PyTorch dataloader directly with your own datasets (local/streamed).\n\n\nLocal:\n\n\nStreaming:\n\n\nTo find out more about loading and preparing audio datasets, head over to URL\n\n\nDataset Structure\n-----------------### Data Instances\n\n\nA typical data point comprises the path to the audio file, usually called 'file' and its transcription, called 'text'. Some additional information about the speaker and the passage which contains the transcription is provided.### Data Fields\n\n\n* audio: A dictionary containing the audio filename, the decoded audio array, and the sampling rate. Note that when accessing the audio column: 'dataset[0][\"audio\"]' the audio file is automatically decoded and resampled to 'dataset.features[\"audio\"].sampling\\_rate'. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the '\"audio\"' column, *i.e.* 'dataset[0][\"audio\"]' should always be preferred over 'dataset[\"audio\"][0]'.\n* text: the transcription of the audio file.\n* speaker\\_id: unique id of the speaker. The same speaker id can be found for multiple data samples.\n* transcript\\_wav2vec: the transcription of the audio file using the wav2vec model. Has been used to curate the dataset.\n* wav\\_filesize: The size of the audio waveform file. Has been used to curate the dataset.\n* levenshtein: The Levenshtein distance between the wav2vec transcription and the original transcription. 
Has been used to curate the dataset.\n* duration: The duration of the audio in seconds.\n* num\\_words: The number of words of the transcription.### Data Splits### Data Statistics\n\n\n\nDataset Creation\n----------------### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process" ]
277a04f7e50e0755eff698d265730a4aa7a32d32
Compared with the previous version: 1. Added exponentiation and root-extraction (square root) problems. 2. New generation ratios: four basic operations 45%, linear equations in one unknown 30%, word problems 15%, powers and roots 10%. 3. Added a four-operations variation: at generation time there is a 20% chance of appending the question "What is this number (plus, minus, times, or divided by) a?" (stackable). Contact: QQ: 2981447942; Bilibili: 一髅子Tick
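The stated generation mix can be illustrated with a small weighted sampler. This is a hypothetical sketch of the ratios described above, not the dataset's actual generator; in particular, the "stackable" 20% follow-up is assumed to repeat independently.

```python
# Hedged illustration of the stated generation mix: four basic operations 45%,
# linear equations 30%, word problems 15%, powers/roots 10%, with a 20% chance
# of appending a stacked follow-up question to four-operations items.
# This is NOT the dataset's actual generator, just a sketch of the ratios.
import random

CATEGORIES = ["four_operations", "linear_equation", "word_problem", "power_or_root"]
WEIGHTS = [0.45, 0.30, 0.15, 0.10]

def sample_problem_kind(rng):
    kind = rng.choices(CATEGORIES, weights=WEIGHTS, k=1)[0]
    follow_ups = 0
    if kind == "four_operations":
        # Assumption: each extra "what is this number (+,-,*,/) a?" question
        # is appended with independent probability 0.2 ("stackable").
        while rng.random() < 0.20:
            follow_ups += 1
    return kind, follow_ups

rng = random.Random(0)
kinds = [sample_problem_kind(rng)[0] for _ in range(10_000)]
print(kinds.count("four_operations") / len(kinds))  # close to 0.45 by construction
```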
TICK666/Basic-Math-Chinese-1M-V1.1
[ "task_categories:question-answering", "size_categories:1M<n<10M", "language:zh", "license:llama2", "region:us" ]
2023-11-23T12:06:16+00:00
{"language": ["zh"], "license": "llama2", "size_categories": ["1M<n<10M"], "task_categories": ["question-answering"], "pretty_name": "Basic-Math-Chinese-1M-V1.1"}
2023-11-23T12:19:53+00:00
[]
[ "zh" ]
TAGS #task_categories-question-answering #size_categories-1M<n<10M #language-Chinese #license-llama2 #region-us
Compared with the previous version: 1. Added exponentiation and root-extraction (square root) problems. 2. New generation ratios: four basic operations 45%, linear equations in one unknown 30%, word problems 15%, powers and roots 10%. 3. Added a four-operations variation: at generation time there is a 20% chance of appending the question "What is this number (plus, minus, times, or divided by) a?" (stackable). Contact: QQ: 2981447942; Bilibili: 一髅子Tick
[]
[ "TAGS\n#task_categories-question-answering #size_categories-1M<n<10M #language-Chinese #license-llama2 #region-us \n" ]
[ 42 ]
[ "passage: TAGS\n#task_categories-question-answering #size_categories-1M<n<10M #language-Chinese #license-llama2 #region-us \n" ]
f5e5a1c4395010c4b36352298437e04b627b35c4
# Dataset Card for "general_query" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
bh8648/general_query
[ "region:us" ]
2023-11-23T12:08:55+00:00
{"dataset_info": {"features": [{"name": "Instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 941023, "num_examples": 334}], "download_size": 398211, "dataset_size": 941023}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-11-23T12:09:06+00:00
[]
[]
TAGS #region-us
# Dataset Card for "general_query" More Information needed
[ "# Dataset Card for \"general_query\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"general_query\"\n\nMore Information needed" ]
[ 6, 14 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"general_query\"\n\nMore Information needed" ]
8e6940d5e7ae09bebf5a6aac4e8cf9dc81156e17
# [doc] formats 1 This dataset contains files for a collection of supported formats, each of which is loaded in a different config (see the YAML field `configs`).
datasets-examples/doc-unsupported-1
[ "size_categories:n<1K", "region:us" ]
2023-11-23T12:11:51+00:00
{"size_categories": ["n<1K"], "configs": [{"config_name": "csv", "data_files": "*.csv"}, {"config_name": "tsv", "data_files": "*.tsv"}, {"config_name": "json", "data_files": "*.json"}, {"config_name": "jsonl", "data_files": "*.jsonl"}, {"config_name": "txt", "data_files": "*.txt"}]}
2023-11-23T12:47:05+00:00
[]
[]
TAGS #size_categories-n<1K #region-us
# [doc] formats 1 This dataset contains files for a collection of supported formats, each of which is loaded in a different config (see the YAML field 'configs').
[ "# [doc] formats 1\n\nThis dataset contains files for a collection of supported formats, each of which is loaded in a different config (see the YAML field 'configs')." ]
[ "TAGS\n#size_categories-n<1K #region-us \n", "# [doc] formats 1\n\nThis dataset contains files for a collection of supported formats, each of which is loaded in a different config (see the YAML field 'configs')." ]
[ 16, 44 ]
[ "passage: TAGS\n#size_categories-n<1K #region-us \n# [doc] formats 1\n\nThis dataset contains files for a collection of supported formats, each of which is loaded in a different config (see the YAML field 'configs')." ]
d762d125676c8d87b8559e016c8e6697261524ad
# Dataset Card for "stackexchange_data" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
loubnabnl/stackexchange_data
[ "region:us" ]
2023-11-23T12:42:56+00:00
{"dataset_info": {"features": [{"name": "qid", "dtype": "int64"}, {"name": "question", "dtype": "string"}, {"name": "answers", "list": [{"name": "answer_id", "dtype": "int64"}, {"name": "author", "dtype": "string"}, {"name": "author_id", "dtype": "int64"}, {"name": "author_profile", "dtype": "string"}, {"name": "pm_score", "dtype": "int64"}, {"name": "selected", "dtype": "bool"}, {"name": "text", "dtype": "string"}]}, {"name": "date", "dtype": "string"}, {"name": "metadata", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 23611705, "num_examples": 5000}], "download_size": 12340769, "dataset_size": 23611705}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-11-23T12:42:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for "stackexchange_data" More Information needed
[ "# Dataset Card for \"stackexchange_data\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"stackexchange_data\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"stackexchange_data\"\n\nMore Information needed" ]
3d21dec20debc1d44d6ee88923f8bafba104b6e9
# [doc] formats - csv - 1

This dataset contains one csv file at the root:

- [data.csv](./data.csv)

```csv
kind,sound
dog,woof
cat,meow
pokemon,pika
human,hello
```

The YAML section of the README does not contain anything related to loading the data (only the size category metadata):

```yaml
---
size_categories:
- n<1K
---
```
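As a hedged sanity check, the sample file shown in this card parses cleanly with Python's stdlib `csv` module (the Hub's default CSV loader should infer the same two columns):

```python
# Hedged sketch: parse the sample data.csv contents from this card with the
# stdlib csv module. Illustration only; the Hub loads the file automatically.
import csv
import io

SAMPLE = "kind,sound\ndog,woof\ncat,meow\npokemon,pika\nhuman,hello\n"

rows = list(csv.DictReader(io.StringIO(SAMPLE)))
print(len(rows))   # 4
print(rows[0])     # {'kind': 'dog', 'sound': 'woof'}
```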
datasets-examples/doc-formats-csv-1
[ "size_categories:n<1K", "region:us" ]
2023-11-23T13:04:55+00:00
{"size_categories": ["n<1K"]}
2023-11-23T14:14:43+00:00
[]
[]
TAGS #size_categories-n<1K #region-us
# [doc] formats - csv - 1 This dataset contains one csv file at the root: - URL The YAML section of the README does not contain anything related to loading the data (only the size category metadata):
[ "# [doc] formats - csv - 1\n\nThis dataset contains one csv file at the root:\n\n- URL\n\n\n\nThe YAML section of the README does not contain anything related to loading the data (only the size category metadata):" ]
[ "TAGS\n#size_categories-n<1K #region-us \n", "# [doc] formats - csv - 1\n\nThis dataset contains one csv file at the root:\n\n- URL\n\n\n\nThe YAML section of the README does not contain anything related to loading the data (only the size category metadata):" ]
[ 16, 53 ]
[ "passage: TAGS\n#size_categories-n<1K #region-us \n# [doc] formats - csv - 1\n\nThis dataset contains one csv file at the root:\n\n- URL\n\n\n\nThe YAML section of the README does not contain anything related to loading the data (only the size category metadata):" ]
7e5f1ec9fd8f323182e84d990819854bb72da478
We carefully collected 400 videos, each featuring dynamic scenes and rich logical-reasoning content. On average, these videos are 76.5 seconds long (5 FPS). The collection comprises 289 videos from VidOR, 55 videos from EpicKitchen, and 56 videos from Ego4D. Please `git clone` https://github.com/LilyDaytoy/OpenPVSG and organize your files according to the structure below. You can put the HF dataset in the `data_zip` directory.

```
├── assets
├── checkpoints
├── configs
├── data
├── data_zip
│   ├── Ego4D
│   │   ├── ego4d_masks.zip
│   │   └── ego4d_videos.zip
│   ├── EpicKitchen
│   │   ├── epic_kitchen_masks.zip
│   │   └── epic_kitchen_videos.zip
│   ├── VidOR
│   │   ├── vidor_masks.zip
│   │   └── vidor_videos.zip
│   └── pvsg.json
├── datasets
├── models
├── scripts
├── tools
├── utils
├── .gitignore
├── environment.yml
└── README.md
```

Please run unzip_and_extract.py to unzip the files and extract frames from the videos. If you unzip manually, make sure to use `unzip -j xxx.zip` to remove junk paths. Your data directory should then look like this:

```
data
├── ego4d
│   ├── frames
│   ├── masks
│   └── videos
├── epic_kitchen
│   ├── frames
│   ├── masks
│   └── videos
├── vidor
│   ├── frames
│   ├── masks
│   └── videos
└── pvsg.json
```

We suggest users play with `./notebooks/Visualize_Dataset.ipynb` to quickly get familiar with the PVSG dataset.

## Citation

For more information about the methods used in this dataset, please refer to the following paper: [Panoptic Video Scene Graph Generation](https://arxiv.org/abs/2310.15166), arXiv:2310.15166.
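The manual extraction step described above can be sketched as a small shell loop. This is a hedged example assuming the `data_zip/` layout shown in this card; archives that are not present are simply skipped, and the card's own unzip_and_extract.py remains the supported path.

```shell
# Hedged sketch: unzip each archive with -j (drop junk paths) into data/,
# mirroring the layout this card describes. Missing archives are skipped,
# so the loop is a harmless no-op on a checkout without the zips.
processed=0
for z in data_zip/*/*.zip; do
  [ -f "$z" ] || continue
  unzip -j "$z" -d "data/$(basename "$z" .zip)"
  processed=$((processed + 1))
done
echo "unzipped $processed archive(s)"
```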
Jingkang/PVSG
[ "license:mit", "arxiv:2310.15166", "region:us" ]
2023-11-23T13:07:42+00:00
{"license": "mit"}
2023-11-29T17:52:11+00:00
[ "2310.15166" ]
[]
TAGS #license-mit #arxiv-2310.15166 #region-us
We carefully collect 400 videos, each featuring dynamic scenes and rich in logical reasoning content. On average, these videos are 76.5 seconds long (5 FPS). The collection comprises 289 videos from VidOR, 55 videos from EpicKitchen, and 56 videos from Ego4D. Please 'git clone' URL and organize your files according to the structure below. You can put the HF dataset in the 'data_zip' directory. Please run unzip_and_extract.py to unzip the files and extract frames from the videos. If you unzip manually, make sure to use 'unzip -j URL' to remove junk paths. Your data directory should look like this: We suggest users play with './notebooks/Visualize_Dataset.ipynb' to quickly get familiar with the PVSG dataset. For more information about the methods used in this dataset, please refer to the following paper: Panoptic Video Scene Graph Generation, arXiv:2310.15166.
[]
[ "TAGS\n#license-mit #arxiv-2310.15166 #region-us \n" ]
[ 20 ]
[ "passage: TAGS\n#license-mit #arxiv-2310.15166 #region-us \n" ]
1910dbe1354b1be7bf97d5660f7dd4696ffd4031
# [doc] formats - csv - 2 This dataset contains one csv file at the root: - [data.csv](./data.csv) ```csv kind,sound dog,woof cat,meow pokemon,pika human,hello ``` We define the separator as `","` in the YAML config, as well as the config name and the location of the file, with a glob expression: ```yaml --- configs: - config_name: default data_files: "*.csv" sep: "," size_categories: - n<1K --- ```
datasets-examples/doc-formats-csv-2
[ "size_categories:n<1K", "region:us" ]
2023-11-23T13:11:22+00:00
{"size_categories": ["n<1K"], "configs": [{"config_name": "default", "data_files": "*.csv", "sep": ","}]}
2023-11-23T14:16:21+00:00
[]
[]
TAGS #size_categories-n<1K #region-us
# [doc] formats - csv - 2 This dataset contains one csv file at the root: - URL We define the separator as '","' in the YAML config, as well as the config name and the location of the file, with a glob expression:
[ "# [doc] formats - csv - 2\n\nThis dataset contains one csv file at the root:\n\n- URL\n\n\n\nWe define the separator as '\",\"' in the YAML config, as well as the config name and the location of the file, with a glob expression:" ]
[ "TAGS\n#size_categories-n<1K #region-us \n", "# [doc] formats - csv - 2\n\nThis dataset contains one csv file at the root:\n\n- URL\n\n\n\nWe define the separator as '\",\"' in the YAML config, as well as the config name and the location of the file, with a glob expression:" ]
[ 16, 62 ]
[ "passage: TAGS\n#size_categories-n<1K #region-us \n# [doc] formats - csv - 2\n\nThis dataset contains one csv file at the root:\n\n- URL\n\n\n\nWe define the separator as '\",\"' in the YAML config, as well as the config name and the location of the file, with a glob expression:" ]
b7e951095aedb9d6737f917187eb39aca4e55aa6
# [doc] formats - tsv - 1 This dataset contains one tsv file at the root: - [data.tsv](./data.tsv) ```csv kind sound dog woof cat meow pokemon pika human hello ``` The YAML section of the README does not contain anything related to loading the data (only the size category metadata): ```yaml --- size_categories: - n<1K --- ``` The delimiter is automatically set to `"\t"` (tabulation) because of the `.tsv` extension of the data file.
datasets-examples/doc-formats-tsv-1
[ "size_categories:n<1K", "region:us" ]
2023-11-23T13:13:01+00:00
{"size_categories": ["n<1K"]}
2023-11-23T14:26:30+00:00
[]
[]
TAGS #size_categories-n<1K #region-us
# [doc] formats - tsv - 1 This dataset contains one tsv file at the root: - URL The YAML section of the README does not contain anything related to loading the data (only the size category metadata): The delimiter is automatically set to '"\t"' (tabulation) because of the '.tsv' extension of the data file.
[ "# [doc] formats - tsv - 1\n\nThis dataset contains one tsv file at the root:\n\n- URL\n\n\n\nThe YAML section of the README does not contain anything related to loading the data (only the size category metadata):\n\n\n\nThe delimiter is automatically set to '\"\\t\"' (tabulation) because of the '.tsv' extension of the data file." ]
[ "TAGS\n#size_categories-n<1K #region-us \n", "# [doc] formats - tsv - 1\n\nThis dataset contains one tsv file at the root:\n\n- URL\n\n\n\nThe YAML section of the README does not contain anything related to loading the data (only the size category metadata):\n\n\n\nThe delimiter is automatically set to '\"\\t\"' (tabulation) because of the '.tsv' extension of the data file." ]
[ 16, 85 ]
[ "passage: TAGS\n#size_categories-n<1K #region-us \n# [doc] formats - tsv - 1\n\nThis dataset contains one tsv file at the root:\n\n- URL\n\n\n\nThe YAML section of the README does not contain anything related to loading the data (only the size category metadata):\n\n\n\nThe delimiter is automatically set to '\"\\t\"' (tabulation) because of the '.tsv' extension of the data file." ]
85077d4b454e4f0e5c80906844354c46edc7e4da
# [doc] formats - tsv - 2 This dataset contains one tsv file at the root: - [data.tsv](./data.tsv) ```csv kind sound dog woof cat meow pokemon pika human hello ``` We define the separator as `"\t"` (tabulation) in the YAML config, as well as the config name and the location of the file, with a glob expression: ```yaml configs: - config_name: default data_files: "*.tsv" sep: "\t" size_categories: - n<1K ```
datasets-examples/doc-formats-tsv-2
[ "size_categories:n<1K", "region:us" ]
2023-11-23T13:14:04+00:00
{"size_categories": ["n<1K"], "configs": [{"config_name": "default", "data_files": "*.tsv", "sep": "\t"}]}
2023-11-23T14:28:12+00:00
[]
[]
TAGS #size_categories-n<1K #region-us
# [doc] formats - tsv - 2 This dataset contains one tsv file at the root: - URL We define the separator as '"\t"' (tabulation) in the YAML config, as well as the config name and the location of the file, with a glob expression:
[ "# [doc] formats - tsv - 2\n\nThis dataset contains one tsv file at the root:\n\n- URL\n\n\n\nWe define the separator as '\"\\t\"' (tabulation) in the YAML config, as well as the config name and the location of the file, with a glob expression:" ]
[ "TAGS\n#size_categories-n<1K #region-us \n", "# [doc] formats - tsv - 2\n\nThis dataset contains one tsv file at the root:\n\n- URL\n\n\n\nWe define the separator as '\"\\t\"' (tabulation) in the YAML config, as well as the config name and the location of the file, with a glob expression:" ]
[ 16, 68 ]
[ "passage: TAGS\n#size_categories-n<1K #region-us \n# [doc] formats - tsv - 2\n\nThis dataset contains one tsv file at the root:\n\n- URL\n\n\n\nWe define the separator as '\"\\t\"' (tabulation) in the YAML config, as well as the config name and the location of the file, with a glob expression:" ]
efa44bf208725fffa67519fb8670c84790546f3f
# [doc] formats - jsonl - 1 This dataset contains one jsonl file at the root.
datasets-examples/doc-formats-jsonl-1
[ "size_categories:n<1K", "region:us" ]
2023-11-23T13:23:15+00:00
{"size_categories": ["n<1K"]}
2023-11-23T13:23:51+00:00
[]
[]
TAGS #size_categories-n<1K #region-us
# [doc] formats - jsonl - 1 This dataset contains one jsonl file at the root.
[ "# [doc] formats - jsonl - 1\n\nThis dataset contains one jsonl file at the root." ]
[ "TAGS\n#size_categories-n<1K #region-us \n", "# [doc] formats - jsonl - 1\n\nThis dataset contains one jsonl file at the root." ]
[ 16, 26 ]
[ "passage: TAGS\n#size_categories-n<1K #region-us \n# [doc] formats - jsonl - 1\n\nThis dataset contains one jsonl file at the root." ]
65edeabfa512a5109bbbb2fe4fc26934bd3ba3ac
# [doc] formats - json - 1 This dataset contains one json file at the root. It's a list of rows, each of which is a dict of columns.
datasets-examples/doc-formats-json-1
[ "size_categories:n<1K", "region:us" ]
2023-11-23T13:24:42+00:00
{"size_categories": ["n<1K"]}
2023-11-23T14:11:53+00:00
[]
[]
TAGS #size_categories-n<1K #region-us
# [doc] formats - json - 1 This dataset contains one json file at the root. It's a list of rows, each of which is a dict of columns.
[ "# [doc] formats - json - 1\n\nThis dataset contains one json file at the root. It's a list of rows, each of which is a dict of columns." ]
[ "TAGS\n#size_categories-n<1K #region-us \n", "# [doc] formats - json - 1\n\nThis dataset contains one json file at the root. It's a list of rows, each of which is a dict of columns." ]
[ 16, 46 ]
[ "passage: TAGS\n#size_categories-n<1K #region-us \n# [doc] formats - json - 1\n\nThis dataset contains one json file at the root. It's a list of rows, each of which is a dict of columns." ]
b4681879a1c0fee93e266c8a07f9af771aae003f
# [doc] formats - txt - 1 This dataset contains one txt file at the root. It can only contain one column of strings.
severo/doc-formats-txt-1
[ "size_categories:n<1K", "region:us" ]
2023-11-23T13:26:59+00:00
{"size_categories": ["n<1K"]}
2023-11-23T13:28:00+00:00
[]
[]
TAGS #size_categories-n<1K #region-us
# [doc] formats - txt - 1 This dataset contains one txt file at the root. It can only contain one column of strings.
[ "# [doc] formats - txt - 1\n\nThis dataset contains one txt file at the root. It can only contain one column of strings." ]
[ "TAGS\n#size_categories-n<1K #region-us \n", "# [doc] formats - txt - 1\n\nThis dataset contains one txt file at the root. It can only contain one column of strings." ]
[ 16, 36 ]
[ "passage: TAGS\n#size_categories-n<1K #region-us \n# [doc] formats - txt - 1\n\nThis dataset contains one txt file at the root. It can only contain one column of strings." ]
d2d486c0ad14f113b9b0c601d44b39193a430c0d
# Dataset Card for "c_llvm_O0_exebench_json_cleaned" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
zhangshuoming/c_llvm_O0_exebench_json_cleaned
[ "region:us" ]
2023-11-23T13:46:49+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 515356461, "num_examples": 566749}], "download_size": 154524123, "dataset_size": 515356461}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-11-25T12:22:58+00:00
[]
[]
TAGS #region-us
# Dataset Card for "c_llvm_O0_exebench_json_cleaned" More Information needed
[ "# Dataset Card for \"c_llvm_O0_exebench_json_cleaned\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"c_llvm_O0_exebench_json_cleaned\"\n\nMore Information needed" ]
[ 6, 29 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"c_llvm_O0_exebench_json_cleaned\"\n\nMore Information needed" ]
3a5f0900133e822ddf8908bc525d2ef105652156
# [doc] formats - csv - 3

This dataset contains one csv file at the root:

- [data.csv](./data.csv)

```csv
# ignored comment
col1|col2
dog|woof
cat|meow
pokemon|pika
human|hello
```

We define the config name in the YAML config, as well as the exact location of the file, the separator as `"|"`, the names of the columns, and the number of rows to ignore (row #1 is a row of column headers, which will be replaced by the `names` option, and row #0 is ignored). The reference for the options is the [documentation of pandas.read_csv()](https://pandas.pydata.org/docs/reference/api/pandas.read_csv.html).

```yaml
---
configs:
- config_name: default
  data_files: "data.csv"
  delimiter: "|"
  header: 1
  names: ["kind", "sound"]
size_categories:
- n<1K
---
```
datasets-examples/doc-formats-csv-3
[ "size_categories:n<1K", "region:us" ]
2023-11-23T13:58:27+00:00
{"size_categories": ["n<1K"], "configs": [{"config_name": "default", "data_files": "data.csv", "delimiter": "|", "header": 1, "names": ["kind", "sound"]}]}
2023-11-23T14:20:20+00:00
[]
[]
TAGS #size_categories-n<1K #region-us
# [doc] formats - csv - 3


This dataset contains one csv file at the root:

- URL



We define the config name in the YAML config, as well as the exact location of the file, the separator as '"|"', the names of the columns, and the number of rows to ignore (row #1 is a row of column headers, which will be replaced by the 'names' option, and row #0 is ignored). The reference for the options is the documentation of pandas.read_csv().
[ "# [doc] formats - csv - 3\n\n\nThis dataset contains one csv file at the root:\n\n- URL\n\n\n\nWe define the config name in the YAML config, as well as the exact location of the file, the separator as '\"|\"', the name of the columns, and the number of rows to ignore (the row #1 is a row of column headers, that will be replaced by the 'names' option, and the row #0 is ignored). The reference for the options is the documentation of pandas.read_csv()." ]
[ "TAGS\n#size_categories-n<1K #region-us \n", "# [doc] formats - csv - 3\n\n\nThis dataset contains one csv file at the root:\n\n- URL\n\n\n\nWe define the config name in the YAML config, as well as the exact location of the file, the separator as '\"|\"', the name of the columns, and the number of rows to ignore (the row #1 is a row of column headers, that will be replaced by the 'names' option, and the row #0 is ignored). The reference for the options is the documentation of pandas.read_csv()." ]
[ 16, 133 ]
[ "passage: TAGS\n#size_categories-n<1K #region-us \n# [doc] formats - csv - 3\n\n\nThis dataset contains one csv file at the root:\n\n- URL\n\n\n\nWe define the config name in the YAML config, as well as the exact location of the file, the separator as '\"|\"', the name of the columns, and the number of rows to ignore (the row #1 is a row of column headers, that will be replaced by the 'names' option, and the row #0 is ignored). The reference for the options is the documentation of pandas.read_csv()." ]
0f6acb11128b96d4027ce84a322c39522b62b1c8
# Dataset Card for "dataset-long-context-for-e5-finetune" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
wenzhuoliu/dataset-long-context-for-e5-finetune
[ "task_categories:text-retrieval", "language:fr", "region:us" ]
2023-11-23T13:59:17+00:00
{"language": ["fr"], "task_categories": ["text-retrieval"], "dataset_info": [{"config_name": "Testset", "features": [{"name": "query", "dtype": "string"}, {"name": "passage", "dtype": "string"}], "splits": [{"name": "philosophie", "num_bytes": 607946, "num_examples": 216}], "download_size": 352564, "dataset_size": 607946}, {"config_name": "testset", "features": [{"name": "query", "dtype": "string"}, {"name": "passage", "dtype": "string"}], "splits": [{"name": "llm_wikitexts", "num_bytes": 355087, "num_examples": 99}, {"name": "llm_wiki_single_long_document", "num_bytes": 367297, "num_examples": 148}], "download_size": 467511, "dataset_size": 722384}, {"config_name": "trainset", "features": [{"name": "query", "dtype": "string"}, {"name": "passage", "dtype": "string"}], "splits": [{"name": "wikihow_summary_passage", "num_bytes": 332619989, "num_examples": 111637}, {"name": "llm_generated_question_passage", "num_bytes": 74929318, "num_examples": 20000}, {"name": "qestion_passage_fr", "num_bytes": 18560943, "num_examples": 20535}], "download_size": 243783107, "dataset_size": 426110250}], "configs": [{"config_name": "Testset", "data_files": [{"split": "philosophie", "path": "Testset/philosophie-*"}]}, {"config_name": "testset", "data_files": [{"split": "llm_wikitexts", "path": "llm_eval/train-*"}, {"split": "single_document", "path": "llm_eval/single_document-*"}]}, {"config_name": "trainset", "data_files": [{"split": "wikihow_summary_passage", "path": "data/wikihow_summary_passage-*"}, {"split": "llm_generated_question_passage", "path": "data/llm_generated_question_passage-*"}, {"split": "qestion_passage_fr", "path": "data/qestion_passage_fr-*"}]}]}
2023-11-30T16:24:18+00:00
[]
[ "fr" ]
TAGS #task_categories-text-retrieval #language-French #region-us
# Dataset Card for "dataset-long-context-for-e5-finetune" More Information needed
[ "# Dataset Card for \"dataset-long-context-for-e5-finetune\"\n\nMore Information needed" ]
[ "TAGS\n#task_categories-text-retrieval #language-French #region-us \n", "# Dataset Card for \"dataset-long-context-for-e5-finetune\"\n\nMore Information needed" ]
[ 24, 24 ]
[ "passage: TAGS\n#task_categories-text-retrieval #language-French #region-us \n# Dataset Card for \"dataset-long-context-for-e5-finetune\"\n\nMore Information needed" ]
05bf04d126255128a5243b0f344f1eb5fe063310
# [doc] formats - tsv - 3

This dataset contains one tsv file at the root:

- [data.tsv](./data.tsv)

```tsv
dog	woof
cat	meow
pokemon	pika
human	hello
```

We define the config name in the YAML config, the file's exact location, and the columns' names. As we provide the `names` option, but not the `header` one, the first row in the file is considered a row of values, not a row of column names. The delimiter is set to `"\t"` (tabulation) due to the file's extension. The reference for the options is the [documentation of pandas.read_csv()](https://pandas.pydata.org/docs/reference/api/pandas.read_csv.html).

```yaml
---
configs:
- config_name: default
  data_files: "data.tsv"
  names: ["kind", "sound"]
size_categories:
- n<1K
---
```
datasets-examples/doc-formats-tsv-3
[ "size_categories:n<1K", "region:us" ]
2023-11-23T14:01:15+00:00
{"size_categories": ["n<1K"], "configs": [{"config_name": "default", "data_files": "data.tsv", "names": ["kind", "sound"]}]}
2023-11-23T14:38:41+00:00
[]
[]
TAGS #size_categories-n<1K #region-us
# [doc] formats - tsv - 3

This dataset contains one tsv file at the root:

- URL



We define the config name in the YAML config, the file's exact location, and the columns' names. As we provide the 'names' option, but not the 'header' one, the first row in the file is considered a row of values, not a row of column names. The delimiter is set to '"\t"' (tabulation) due to the file's extension. The reference for the options is the documentation of pandas.read_csv().
[ "# [doc] formats - tsv - 3\n\nThis dataset contains one tsv file at the root:\n\n- URL\n\n\n\nWe define the config name in the YAML config, the file's exact location, and the columns' name. As we provide the 'names' option, but not the 'header' one, the first row in the file is considered a row of values, not a row of column names. The delimiter is set to '\"\\t\"' (tabulation) due to the file's extension. The reference for the options is the documentation of pandas.read_csv()." ]
[ "TAGS\n#size_categories-n<1K #region-us \n", "# [doc] formats - tsv - 3\n\nThis dataset contains one tsv file at the root:\n\n- URL\n\n\n\nWe define the config name in the YAML config, the file's exact location, and the columns' name. As we provide the 'names' option, but not the 'header' one, the first row in the file is considered a row of values, not a row of column names. The delimiter is set to '\"\\t\"' (tabulation) due to the file's extension. The reference for the options is the documentation of pandas.read_csv()." ]
[ 16, 141 ]
[ "passage: TAGS\n#size_categories-n<1K #region-us \n# [doc] formats - tsv - 3\n\nThis dataset contains one tsv file at the root:\n\n- URL\n\n\n\nWe define the config name in the YAML config, the file's exact location, and the columns' name. As we provide the 'names' option, but not the 'header' one, the first row in the file is considered a row of values, not a row of column names. The delimiter is set to '\"\\t\"' (tabulation) due to the file's extension. The reference for the options is the documentation of pandas.read_csv()." ]
20a80ee7af39cdf1befd5955e3d6cd8e0b00f1c5
# Dataset Card for "test.v83i.coco-segmentation" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
xrizs/test.v83i.coco-segmentation
[ "region:us" ]
2023-11-23T14:10:05+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "val", "path": "data/val-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "annotation", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 815324785.5, "num_examples": 1814}, {"name": "val", "num_bytes": 205298969.0, "num_examples": 453}], "download_size": 1020036030, "dataset_size": 1020623754.5}}
2023-11-23T14:14:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for "URL-segmentation" More Information needed
[ "# Dataset Card for \"URL-segmentation\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"URL-segmentation\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"URL-segmentation\"\n\nMore Information needed" ]
a610c1020b5d3a56eae0d17889076c5936b26907
# [doc] formats - json - 2 This dataset contains one json file at the root. It's a dictionary of columns, each of which is a list of values.
severo/doc-formats-json-2
[ "size_categories:n<1K", "region:us" ]
2023-11-23T14:12:06+00:00
{"size_categories": ["n<1K"]}
2023-11-23T14:12:22+00:00
[]
[]
TAGS #size_categories-n<1K #region-us
# [doc] formats - json - 2 This dataset contains one json file at the root. It's a dictionary of columns, each of which is a list of values.
[ "# [doc] formats - json - 2\n\nThis dataset contains one json file at the root. It's a dictionary of columns, each of which is a list of values." ]
[ "TAGS\n#size_categories-n<1K #region-us \n", "# [doc] formats - json - 2\n\nThis dataset contains one json file at the root. It's a dictionary of columns, each of which is a list of values." ]
[ 16, 45 ]
[ "passage: TAGS\n#size_categories-n<1K #region-us \n# [doc] formats - json - 2\n\nThis dataset contains one json file at the root. It's a dictionary of columns, each of which is a list of values." ]
49125b7d64f7b39cc088fe93b4b0adc0373a57af
# 🤖 Language Model Test Prompts ## Overview 🌟 This repository contains a curated collection of test prompts designed for robust evaluation of language models. The prompts span a diverse array of topics and complexities, aiming to rigorously assess models' abilities in various contexts — from straightforward factual inquiries to intricate ethical scenarios. ## Dataset 📚 Our dataset is meticulously organized into several categories, each tailored to challenge different facets of language model proficiency: - **Complex Statements Completion** 🔍: Probing the model's understanding of intricate scientific and philosophical concepts. - **Open-ended Questions** ❓: Assessing the ability to handle broad, unconstrained queries. - **Creative Storytelling** 📖: Testing imaginative and narrative capabilities. - **Controversial Topic Statements** 🗣️: Evaluating the approach to sensitive and polarizing topics. - **Philosophical and Ethical Dilemmas** 💭: Delving into complex moral and philosophical territories. - **Technical Explanations and Definitions** 💡: Examining proficiency in technical and academic subjects. - **Historical Events Descriptions** 🌍: Understanding and recounting historical events. - **Hypothetical Situations** 🚀: Exploring responses to theoretical and speculative scenarios. These categories are crafted to push language models to their limits, providing valuable insights into their capabilities and areas for improvement. ## License 📄 The prompts in this repository were generated with the assistance of OpenAI's GPT-4. Usage of these prompts is subject to OpenAI's terms of service and licensing agreements. For more details, please refer to [OpenAI's Terms of Service](https://openai.com/terms/).
harpreetsahota/test-prompts
[ "size_categories:n<1K", "language:en", "region:us" ]
2023-11-23T14:21:20+00:00
{"language": ["en"], "size_categories": ["n<1K"], "pretty_name": "Sample Prompts for Generation"}
2023-11-29T18:42:10+00:00
[]
[ "en" ]
TAGS #size_categories-n<1K #language-English #region-us
# Language Model Test Prompts ## Overview This repository contains a curated collection of test prompts designed for robust evaluation of language models. The prompts span a diverse array of topics and complexities, aiming to rigorously assess models' abilities in various contexts — from straightforward factual inquiries to intricate ethical scenarios. ## Dataset Our dataset is meticulously organized into several categories, each tailored to challenge different facets of language model proficiency: - Complex Statements Completion : Probing the model's understanding of intricate scientific and philosophical concepts. - Open-ended Questions : Assessing the ability to handle broad, unconstrained queries. - Creative Storytelling : Testing imaginative and narrative capabilities. - Controversial Topic Statements ️: Evaluating the approach to sensitive and polarizing topics. - Philosophical and Ethical Dilemmas : Delving into complex moral and philosophical territories. - Technical Explanations and Definitions : Examining proficiency in technical and academic subjects. - Historical Events Descriptions : Understanding and recounting historical events. - Hypothetical Situations : Exploring responses to theoretical and speculative scenarios. These categories are crafted to push language models to their limits, providing valuable insights into their capabilities and areas for improvement. ## License The prompts in this repository were generated with the assistance of OpenAI's GPT-4. Usage of these prompts is subject to OpenAI's terms of service and licensing agreements. For more details, please refer to OpenAI's Terms of Service.
[ "# Language Model Test Prompts", "## Overview \nThis repository contains a curated collection of test prompts designed for robust evaluation of language models. The prompts span a diverse array of topics and complexities, aiming to rigorously assess models' abilities in various contexts — from straightforward factual inquiries to intricate ethical scenarios.", "## Dataset \nOur dataset is meticulously organized into several categories, each tailored to challenge different facets of language model proficiency:\n\n- Complex Statements Completion : Probing the model's understanding of intricate scientific and philosophical concepts.\n- Open-ended Questions : Assessing the ability to handle broad, unconstrained queries.\n- Creative Storytelling : Testing imaginative and narrative capabilities.\n- Controversial Topic Statements ️: Evaluating the approach to sensitive and polarizing topics.\n- Philosophical and Ethical Dilemmas : Delving into complex moral and philosophical territories.\n- Technical Explanations and Definitions : Examining proficiency in technical and academic subjects.\n- Historical Events Descriptions : Understanding and recounting historical events.\n- Hypothetical Situations : Exploring responses to theoretical and speculative scenarios.\n\nThese categories are crafted to push language models to their limits, providing valuable insights into their capabilities and areas for improvement.", "## License \nThe prompts in this repository were generated with the assistance of OpenAI's GPT-4. Usage of these prompts is subject to OpenAI's terms of service and licensing agreements. For more details, please refer to OpenAI's Terms of Service." ]
[ "TAGS\n#size_categories-n<1K #language-English #region-us \n", "# Language Model Test Prompts", "## Overview \nThis repository contains a curated collection of test prompts designed for robust evaluation of language models. The prompts span a diverse array of topics and complexities, aiming to rigorously assess models' abilities in various contexts — from straightforward factual inquiries to intricate ethical scenarios.", "## Dataset \nOur dataset is meticulously organized into several categories, each tailored to challenge different facets of language model proficiency:\n\n- Complex Statements Completion : Probing the model's understanding of intricate scientific and philosophical concepts.\n- Open-ended Questions : Assessing the ability to handle broad, unconstrained queries.\n- Creative Storytelling : Testing imaginative and narrative capabilities.\n- Controversial Topic Statements ️: Evaluating the approach to sensitive and polarizing topics.\n- Philosophical and Ethical Dilemmas : Delving into complex moral and philosophical territories.\n- Technical Explanations and Definitions : Examining proficiency in technical and academic subjects.\n- Historical Events Descriptions : Understanding and recounting historical events.\n- Hypothetical Situations : Exploring responses to theoretical and speculative scenarios.\n\nThese categories are crafted to push language models to their limits, providing valuable insights into their capabilities and areas for improvement.", "## License \nThe prompts in this repository were generated with the assistance of OpenAI's GPT-4. Usage of these prompts is subject to OpenAI's terms of service and licensing agreements. For more details, please refer to OpenAI's Terms of Service." ]
[ 20, 7, 74, 229, 63 ]
[ "passage: TAGS\n#size_categories-n<1K #language-English #region-us \n# Language Model Test Prompts## Overview \nThis repository contains a curated collection of test prompts designed for robust evaluation of language models. The prompts span a diverse array of topics and complexities, aiming to rigorously assess models' abilities in various contexts — from straightforward factual inquiries to intricate ethical scenarios.## Dataset \nOur dataset is meticulously organized into several categories, each tailored to challenge different facets of language model proficiency:\n\n- Complex Statements Completion : Probing the model's understanding of intricate scientific and philosophical concepts.\n- Open-ended Questions : Assessing the ability to handle broad, unconstrained queries.\n- Creative Storytelling : Testing imaginative and narrative capabilities.\n- Controversial Topic Statements ️: Evaluating the approach to sensitive and polarizing topics.\n- Philosophical and Ethical Dilemmas : Delving into complex moral and philosophical territories.\n- Technical Explanations and Definitions : Examining proficiency in technical and academic subjects.\n- Historical Events Descriptions : Understanding and recounting historical events.\n- Hypothetical Situations : Exploring responses to theoretical and speculative scenarios.\n\nThese categories are crafted to push language models to their limits, providing valuable insights into their capabilities and areas for improvement.## License \nThe prompts in this repository were generated with the assistance of OpenAI's GPT-4. Usage of these prompts is subject to OpenAI's terms of service and licensing agreements. For more details, please refer to OpenAI's Terms of Service." ]
189401cf92662b7c44b03fd9d47949ffc34e6def
# Dataset Card for "e13426e5" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
result-kand2-sdxl-wuerst-karlo/e13426e5
[ "region:us" ]
2023-11-23T14:28:12+00:00
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 166, "num_examples": 10}], "download_size": 1307, "dataset_size": 166}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-11-23T14:28:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for "e13426e5" More Information needed
[ "# Dataset Card for \"e13426e5\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"e13426e5\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"e13426e5\"\n\nMore Information needed" ]
842e33cfca9bcd75ed344e3a151174f8809c4526
# Dataset Card for "MetaMath-Vi" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
aops02/MetaMath-Vi
[ "region:us" ]
2023-11-23T14:37:49+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 72172855, "num_examples": 32972}], "download_size": 15771804, "dataset_size": 72172855}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-11-27T19:22:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for "MetaMath-Vi" More Information needed
[ "# Dataset Card for \"MetaMath-Vi\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"MetaMath-Vi\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"MetaMath-Vi\"\n\nMore Information needed" ]
8da60c29f7b626f09c0892c2b2fd5b3a97330d7d
* A Chinese-language LongAlpaca dataset modeled on [Yukang/LongAlpaca-12k](https://huggingface.co/datasets/Yukang/LongAlpaca-12k), with higher data quality, more task types, longer samples, and a multi-turn dialogue format. * Supports instruction fine-tuning that extends a model's context window to 32k tokens. * The text of every sample has already been converted to the chatml dialogue format. ## Sample length statistics (tokenized with the qwen tokenizer): samples with input_ids length 0-4096: 1144; proportion: 0.12 \ samples with input_ids length 4096-8192: 1103; proportion: 0.11 \ samples with input_ids length 8192-16384: 2245; proportion: 0.24 \ samples with input_ids length 16384-24576: 990; proportion: 0.10 \ samples with input_ids length 24576-32768: 3661; proportion: 0.39 \ samples with input_ids length over 32768: 196 \ total samples: 9339 \ mean length: 18292.95920334083
yuyijiong/LongAlpaca-Chinese
[ "language:zh", "license:cc-by-nc-4.0", "region:us" ]
2023-11-23T14:39:20+00:00
{"language": ["zh"], "license": "cc-by-nc-4.0"}
2023-11-29T06:18:02+00:00
[]
[ "zh" ]
TAGS #language-Chinese #license-cc-by-nc-4.0 #region-us
* A Chinese-language LongAlpaca dataset modeled on Yukang/LongAlpaca-12k, with higher data quality, more task types, longer samples, and a multi-turn dialogue format. * Supports instruction fine-tuning that extends a model's context window to 32k tokens. * The text of every sample has already been converted to the chatml dialogue format. ## Sample length statistics (tokenized with the qwen tokenizer): samples with input_ids length 0-4096: 1144; proportion: 0.12 \ samples with input_ids length 4096-8192: 1103; proportion: 0.11 \ samples with input_ids length 8192-16384: 2245; proportion: 0.24 \ samples with input_ids length 16384-24576: 990; proportion: 0.10 \ samples with input_ids length 24576-32768: 3661; proportion: 0.39 \ samples with input_ids length over 32768: 196 \ total samples: 9339 \ mean length: 18292.95920334083
[ "## 数据集样本长度统计(使用qwen的tokenizer进行分词):\ninput_ids长度为0-4096的样本数:1144;占比:0.12 \\\ninput_ids长度为4096-8192的样本数:1103;占比:0.11 \\\ninput_ids长度为8192-16384的样本数:2245;占比:0.24 \\\ninput_ids长度为16384-24576的样本数:990;占比:0.10 \\\ninput_ids长度为24576-32768的样本数:3661;占比:0.39 \\\ninput_ids长度大于32768的样本数:196 \\\n总样本数:9339 \\\n平均长度:18292.95920334083" ]
[ "TAGS\n#language-Chinese #license-cc-by-nc-4.0 #region-us \n", "## 数据集样本长度统计(使用qwen的tokenizer进行分词):\ninput_ids长度为0-4096的样本数:1144;占比:0.12 \\\ninput_ids长度为4096-8192的样本数:1103;占比:0.11 \\\ninput_ids长度为8192-16384的样本数:2245;占比:0.24 \\\ninput_ids长度为16384-24576的样本数:990;占比:0.10 \\\ninput_ids长度为24576-32768的样本数:3661;占比:0.39 \\\ninput_ids长度大于32768的样本数:196 \\\n总样本数:9339 \\\n平均长度:18292.95920334083" ]
[ 22, 177 ]
[ "passage: TAGS\n#language-Chinese #license-cc-by-nc-4.0 #region-us \n## 数据集样本长度统计(使用qwen的tokenizer进行分词):\ninput_ids长度为0-4096的样本数:1144;占比:0.12 \\\ninput_ids长度为4096-8192的样本数:1103;占比:0.11 \\\ninput_ids长度为8192-16384的样本数:2245;占比:0.24 \\\ninput_ids长度为16384-24576的样本数:990;占比:0.10 \\\ninput_ids长度为24576-32768的样本数:3661;占比:0.39 \\\ninput_ids长度大于32768的样本数:196 \\\n总样本数:9339 \\\n平均长度:18292.95920334083" ]
cedec5f0ba895205ff3227882b79c04b4e4e2914
# mtkinit/SuperNovyDataset Created from AIOD platform
mtkinit/mtkinit_SuperNovyDataset
[ "region:us" ]
2023-11-23T14:47:48+00:00
{"pretty_name": "mtkinit/SuperNovyDataset"}
2023-11-23T14:47:48+00:00
[]
[]
TAGS #region-us
# mtkinit/SuperNovyDataset Created from AIOD platform
[ "# mtkinit/SuperNovyDataset\nCreated from AIOD platform" ]
[ "TAGS\n#region-us \n", "# mtkinit/SuperNovyDataset\nCreated from AIOD platform" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# mtkinit/SuperNovyDataset\nCreated from AIOD platform" ]
a7177aaa9a4ae3d733e011ba09b078cb95fee8c1
# mtkinit/TestNewDataset Created from AIOD platform
mtkinit/mtkinit_TestNewDataset
[ "region:us" ]
2023-11-23T15:11:32+00:00
{"pretty_name": "mtkinit/TestNewDataset"}
2023-11-23T15:11:34+00:00
[]
[]
TAGS #region-us
# mtkinit/TestNewDataset Created from AIOD platform
[ "# mtkinit/TestNewDataset\nCreated from AIOD platform" ]
[ "TAGS\n#region-us \n", "# mtkinit/TestNewDataset\nCreated from AIOD platform" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# mtkinit/TestNewDataset\nCreated from AIOD platform" ]
3ad10ea1a047fa565963b41dd5c0129772670b65
# tiger laws: original json file and its vector store. The json file contains the legal statutes from https://huggingface.co/datasets/TigerResearch/tigerbot-law-plugin. The zip file contains a faiss vector store; the embedding model used is https://huggingface.co/GanymedeNil/text2vec-large-chinese
Jinsns/tiger_laws
[ "region:us" ]
2023-11-23T15:11:52+00:00
{}
2023-11-23T15:20:35+00:00
[]
[]
TAGS #region-us
# tiger laws: original json file and its vector store. The json file contains the legal statutes from URL. The zip file contains a faiss vector store; the embedding model used is URL
[ "# tiger laws 原json文件和它的向量库\n\njson文件是来自 URL 的法条\n\nzip文件中是faiss向量库,embedding模型采用 URL" ]
[ "TAGS\n#region-us \n", "# tiger laws 原json文件和它的向量库\n\njson文件是来自 URL 的法条\n\nzip文件中是faiss向量库,embedding模型采用 URL" ]
[ 6, 40 ]
[ "passage: TAGS\n#region-us \n# tiger laws 原json文件和它的向量库\n\njson文件是来自 URL 的法条\n\nzip文件中是faiss向量库,embedding模型采用 URL" ]
47d7475bec82caa09686bfe73872531f37013b49
# Dataset Card for "d8b81ca5" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
result-kand2-sdxl-wuerst-karlo/d8b81ca5
[ "region:us" ]
2023-11-23T15:32:29+00:00
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 163, "num_examples": 10}], "download_size": 1299, "dataset_size": 163}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-11-23T15:32:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for "d8b81ca5" More Information needed
[ "# Dataset Card for \"d8b81ca5\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"d8b81ca5\"\n\nMore Information needed" ]
[ 6, 16 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"d8b81ca5\"\n\nMore Information needed" ]
64f244c00d662be64e231e5a2ed470dd41df78b8
# mtkinit/hellothere Created from AIOD platform
mtkinit/mtkinit_hellothere
[ "region:us" ]
2023-11-23T15:33:27+00:00
{"pretty_name": "mtkinit/hellothere"}
2023-11-23T15:33:28+00:00
[]
[]
TAGS #region-us
# mtkinit/hellothere Created from AIOD platform
[ "# mtkinit/hellothere\nCreated from AIOD platform" ]
[ "TAGS\n#region-us \n", "# mtkinit/hellothere\nCreated from AIOD platform" ]
[ 6, 14 ]
[ "passage: TAGS\n#region-us \n# mtkinit/hellothere\nCreated from AIOD platform" ]
3acd2ea693fd901fa287a71f3b87812e7d4b9f1f
# Dataset Card for "banner_generate" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
PaulTran/banner_generate
[ "region:us" ]
2023-11-23T15:33:45+00:00
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 84118313.344, "num_examples": 1362}], "download_size": 84092692, "dataset_size": 84118313.344}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2023-11-24T04:39:17+00:00
[]
[]
TAGS #region-us
# Dataset Card for "banner_generate" More Information needed
[ "# Dataset Card for \"banner_generate\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"banner_generate\"\n\nMore Information needed" ]
[ 6, 15 ]
[ "passage: TAGS\n#region-us \n# Dataset Card for \"banner_generate\"\n\nMore Information needed" ]
a364094f36532f41979af6f0efe017008774bd35
### Dataset This dataset was created by consolidating information from various Portuguese Archives. We gathered data from these archives and subsequently performed manual annotation of each harvested corpus with Named Entities such as Person, Place, Date, Profession and Organization. The resulting dataset was formed by merging all the individual corpora into a unified corpus which we named "ner-archive-pt" and can be accessed at: http://ner.epl.di.uminho.pt/ ### Citation ```bibtex @Article{make4010003, AUTHOR = {Cunha, Luís Filipe and Ramalho, José Carlos}, TITLE = {NER in Archival Finding Aids: Extended}, JOURNAL = {Machine Learning and Knowledge Extraction}, VOLUME = {4}, YEAR = {2022}, NUMBER = {1}, PAGES = {42--65}, URL = {https://www.mdpi.com/2504-4990/4/1/3}, ISSN = {2504-4990}, ABSTRACT = {The amount of information preserved in Portuguese archives has increased over the years. These documents represent a national heritage of high importance, as they portray the country's history. Currently, most Portuguese archives have made their finding aids available to the public in digital format, however, these data do not have any annotation, so it is not always easy to analyze their content. In this work, Named Entity Recognition solutions were created that allow the identification and classification of several named entities from the archival finding aids. These named entities translate into crucial information about their context and, with high confidence results, they can be used for several purposes, for example, the creation of smart browsing tools by using entity linking and record linking techniques. In order to achieve high result scores, we annotated several corpora to train our own Machine Learning algorithms in this context domain. We also used different architectures, such as CNNs, LSTMs, and Maximum Entropy models. 
Finally, all the created datasets and ML models were made available to the public with a developed web platform, NER@DI.}, DOI = {10.3390/make4010003} } ```
lfcc/ner_archive_pt
[ "task_categories:token-classification", "size_categories:100K<n<1M", "language:pt", "region:us" ]
2023-11-23T16:11:37+00:00
{"language": ["pt"], "size_categories": ["100K<n<1M"], "task_categories": ["token-classification"]}
2023-11-23T16:43:28+00:00
[]
[ "pt" ]
TAGS #task_categories-token-classification #size_categories-100K<n<1M #language-Portuguese #region-us
### Dataset This dataset was created by consolidating information from various Portuguese Archives. We gathered data from these archives and subsequently performed manual annotation of each harvested corpus with Named Entities such as Person, Place, Date, Profession and Organization. The resulting dataset was formed by merging all the individual corpora into a unified corpus which we named "ner-archive-pt" and can be accessed at: URL
[ "### Dataset\nThis dataset was created by consolidating information from various Portuguese Archives. \nWe gathered data from these archives and subsequently performed manual annotation of each harvested corpus with Named Entities such as Person, Place, Date, Profession and Organization. \nThe resulting dataset was formed by merging all the individual corpora into a unified corpus which we named \"ner-archive-pt\" and can be accessed at: URL" ]
[ "TAGS\n#task_categories-token-classification #size_categories-100K<n<1M #language-Portuguese #region-us \n", "### Dataset\nThis dataset was created by consolidating information from various Portuguese Archives. \nWe gathered data from these archives and subsequently performed manual annotation of each harvested corpus with Named Entities such as Person, Place, Date, Profession and Organization. \nThe resulting dataset was formed by merging all the individual corpora into a unified corpus which we named \"ner-archive-pt\" and can be accessed at: URL" ]
[ 36, 99 ]
[ "passage: TAGS\n#task_categories-token-classification #size_categories-100K<n<1M #language-Portuguese #region-us \n### Dataset\nThis dataset was created by consolidating information from various Portuguese Archives. \nWe gathered data from these archives and subsequently performed manual annotation of each harvested corpus with Named Entities such as Person, Place, Date, Profession and Organization. \nThe resulting dataset was formed by merging all the individual corpora into a unified corpus which we named \"ner-archive-pt\" and can be accessed at: URL" ]